
General Essays and Articles | Free Access

Supporting Undergraduate Biology Students’ Academic Success: Comparing Two Workshop Interventions

    Published Online: https://doi.org/10.1187/cbe.21-03-0068

    Abstract

    College students’ performance in introductory-level biology course work is an important predictor of ongoing persistence in the major. This study reports on a researcher–educator partnership that designed and compared two cocurricular workshops. Seventeen laboratory sections of an undergraduate biology course were randomly assigned to one of two educational interventions, delivered during regularly scheduled lab sections after students had completed and received the results of the first exam. The baseline Metacognition intervention was an hourlong workshop focused on effective learning strategies and self-awareness in the learning process; the extended Metacognition plus Time Management (Metacognition+TM) intervention included the aforementioned workshop plus a second hourlong workshop on time management and procrastination. Based on three exams and self-report surveys administered before the intervention and at the end of the semester, students who participated in the Metacognition+TM intervention experienced greater increases in their exam scores and degree commitment than those in the baseline intervention. Additionally, group status moderated the effect of the intervention, as the Metacognition+TM intervention was especially effective in increasing use of time management tools by students from minoritized groups.

    INTRODUCTION

    How and when to study are key considerations for effective learning in the field of biology (Rytkonen et al., 2012; Aflalo, 2018). Effective study strategies are especially needed in the college context, where undergraduate students face the demands of increased quality and quantity of learning amid greater flexibility in how they use their time (Hensley et al., 2015; Wolters and Hoops, 2015). Developing effective study strategies is associated with higher grades as well as a higher likelihood of persisting in college, that is, making progress toward degree plans and, ultimately, graduation (Tinto, 1993; Nora, 2004; Kuh et al., 2006; Sebesta and Speth, 2017). Yet both empirical research and instructors’ experiences in the college classroom suggest that many students do not use effective strategies and that their academic engagement, confidence, and achievement may suffer as a result (Rachal et al., 2007; Cholewa and Ramaswami, 2015; Perin and Holschuh, 2019).

    Various interventions have been developed to help address shortcomings in college students’ study strategies, ranging from full-semester courses (e.g., Tuckman and Kennedy, 2011) and summer bridge programs (e.g., Hoops and Kutrybala, 2015) to onetime workshops (e.g., Nordell, 2009) and integrated classroom supports (e.g., Stanton et al., 2015; Sabel et al., 2017). Although participating in a course devoted to effective learning typically provides the most comprehensive experience and opportunities for feedback (Wolters and Hoops, 2015), it may not always be an option for students who take full course loads or who do not realize the need for improved strategies until partway through a semester. For this reason, instructors and other university personnel have developed brief interventions intended to build effective study strategies, such as workshops (Boretz, 2012; Truschel and Reedy, 2009). Academic workshops typically address just one or two dimensions of learning (Wolters and Hoops, 2015), such as using memorization techniques, preparing for finals, or overcoming procrastination (Truschel and Reedy, 2009).

    There has been a growing focus on understanding how to develop students’ metacognitive abilities to support success in college science courses (Cook et al., 2013; Zhao et al., 2014; Stanton et al., 2015; Dye and Stanton, 2017; van Vliet et al., 2015; Sabel et al., 2017; Sebesta and Speth, 2017). Delivering a metacognitive workshop within the structure provided by an undergraduate course is a promising practice, because domain-relevant instruction may increase the likelihood of transfer to both subsequent learning tasks and other contexts (Zepeda et al., 2015). When students learn about effective learning skills in an overly general manner or outside an actionable context, it can be difficult to apply the skills to their future learning efforts; conversely, contextualizing learning skills instruction within a specific discipline provides authenticity and scaffolding that help students develop new behaviors and ultimately apply them in new settings (Hattie and Donoghue, 2016; Bernacki et al., 2020). Although a brief intervention can make a positive impact by teaching students how to assess their understanding and engage in intentional learning (Hoffmann and McGuire, 2010; McGuire, 2015), an emphasis on metacognition alone may be incomplete. That is, students may be metacognitively aware of the quality of their learning, but they may struggle to put their understanding of effective learning strategies into practice if they cannot regulate their time, motivation, or study environment.

    To advance research and practice on the use of workshops to support study strategies and learning, the present study brought together a team of researchers and educators from a life sciences education center and a student learning center. Our project involved evaluating the impacts of two cocurricular workshop interventions on undergraduate biology students’ academic beliefs, strategies, and achievement. By comparing two workshop interventions—one with a metacognitive focus on being self-aware and intentional as a learner and one with additional content on enacting strategies through managing time and overcoming procrastination—we aimed to assist students with their learning and to better understand the impact of cocurricular workshops on academic outcomes.

    Challenges in College Biology: The Need for Metacognition and Time Management

    Introductory biology is a demanding context for many college students, presenting more rigorous expectations than prior education in terms of the depth, quality, and quantity of learning (McCarthy and Kuh, 2006; Yazedjian et al., 2008). College biology students may have academic difficulties for many reasons, but two stand out as the most critical. One, students may not use effective learning strategies, due in part to a lack of metacognitive awareness and skills (Tanner, 2012). Two, students may struggle with managing their time and ensuring that they can devote the time needed to thoroughly complete academic activities and tasks (Klingsieck et al., 2013). In this section, we summarize key literature related to the nature of these challenges and their connections to academic performance.

    For many students, giving thought to their learning strategies may not have been essential before college. Students may engage with content at a surface level and, based on how easily they seem to recognize the content, assume they understand it adequately (Collins and Sims, 2006; Freeman et al., 2007). This illusion of competence is often at the heart of students’ frustrations and disappointments when receiving an unexpected poor grade on an exam; their judgments that they adequately understood the material proved wrong (Larmar and Lodge, 2014). Metacognition, in contrast, involves understanding the learning task, making choices about how to effectively engage in it, and gauging how well these strategies are working. Metacognition includes three knowledge-based components: declarative knowledge about oneself as a learner, procedural knowledge about how to learn, and conditional knowledge about why and when to apply various approaches to learning (Schraw and Moshman, 1995). In addition, it includes three self-regulatory skills: planning how to engage in learning, monitoring the effectiveness of learning processes in the moment, and evaluating the outcomes of learning efforts (Moshman, 2018).

    Reflecting learning principles articulated by Flavell (1979), metacognition involves the ability to “think about one’s own thinking; be consciously aware of oneself as a problem solver; monitor, plan, and control one’s mental processing; and accurately judge one’s level of learning” (McGuire, 2015, p. 17). Through metacognition, students actively engage in monitoring and regulating their learning. They identify what they do and do not understand (metacognitive knowledge), then make adjustments to comprehend or remember more effectively (metacognitive skills; Flavell, 1979; Pintrich, 2002). Examining studies that together included nearly 20,000 college students, a meta-analysis revealed that metacognitive self-regulation was positively associated with course grade and overall grade point average (GPA), as well as with indicators of academic engagement such as intrinsic motivation and elaborative learning strategies (Credé and Phillips, 2011). In addition, findings from a meta-analysis conducted by Broadbent and Poon (2015) suggested that specific self-regulated learning strategies, including some linked to metacognition, were associated with students’ academic performance in online courses.

    Within the biology context, metacognition has been a particular focus. College students’ use of self-regulated learning strategies, especially metacognitive strategies, has been connected to exam performance in introductory biology (Sebesta and Speth, 2017). In interviews with college students enrolled in upper-division biology course work, Dye and Stanton (2017) found that students tended to describe the importance of developing metacognitive regulation skills, particularly evaluation, in order to succeed in the life sciences. Most interviewed students noted that receiving an unsatisfactory exam grade initially prompted the evaluation of their study strategies (Dye and Stanton, 2017). Researchers have suggested that instructors can integrate supports into biology courses to enhance students’ metacognition (Stanton et al., 2021). For example, reflection assignments, answer keys that include the rationale for answers, and self-evaluations all seem to prompt metacognition (Stanton et al., 2015; Sabel et al., 2017). Stanton et al. (2021) provide further research-based suggestions in their review of metacognition for teachers. Based on prior work, these suggestions include teaching students to engage in metacognition as they prepare for, take, and evaluate their performance on exams, which are often new and challenging learning experiences for college biology students (Dye and Stanton, 2017).

    Time management represents a second key area that is instrumental to success in college and whose absence may lead to academic difficulties. Students who struggle academically tend not to recognize just how much time college-level learning may require and may not have a repertoire of effective tools for managing their time (Freeman et al., 2007). The amount of time students were accustomed to studying in high school—three or fewer hours per week in many cases (McCarthy and Kuh, 2006)—falls short of the recommended 2 to 3 hours per credit hour per week to earn a “C” in college (Nonis et al., 2006). Additional contributors to poor time management include failing to plan or prioritize, becoming distracted by digital media and social opportunities, and waiting until the last minute to study (Claessens et al., 2007; Bembenutty, 2011; Wolters and Brady, 2020). When acting in these ways, students tend to run out of time to use effective learning strategies and do not engage with academic content sufficiently (Hartwig and Dunlosky, 2012).

    A multifaceted concept, time management is understood to include “the awareness that time can be manipulated; the active use of skills, strategies, and tools; creating and maintaining a flexible routine; and the ability to evaluate one’s personal effectiveness” (White et al., 2013, p. 216). Time management also includes overcoming the lure of distractions and procrastination by directing time, effort, and motivation to engage in academic tasks (Klassen et al., 2008). Research consistently associates effective time management with strong academic performance, as measured by overall GPA, as well as individual assignment, exam, and course grades (Landrum et al., 2006; Steel, 2007; Credé and Kuncel, 2008; Credé and Phillips, 2011; Basila, 2014; Broadbent and Poon, 2015). The value of time management for college students is further bolstered by research that demonstrates ties to satisfaction with university life (Krumrei-Mancuso et al., 2013).

    Interventions Designed to Support Academic Success

    The strategies students use to engage in academics are not inherent; instead, they can be taught and developed (Weinstein et al., 2000; Pintrich and Zusho, 2007). As some have argued, learning how to learn not only can be taught but “must be taught” (Gall, 1990, as cited in Tanner, 2012, p. 113). McDaniel and Einstein (2020) suggested that successful learning strategy training must include four components. One, students must learn about the target strategy and how to use that strategy. Two, students must believe that the target strategy will improve their learning. Three, students must feel committed and motivated to use the target strategy. Four, students must receive training to support their future use of a strategy. In particular, understanding when strategies should be used is important for students given the context-specific nature of strategies (Hattie and Donoghue, 2016). Thus, embedding learning strategy training within a specific context might lead to a deeper understanding of the target strategy (Hattie and Donoghue, 2016).

    One type of academic support that has grown in popularity over the past decade is teaching students about metacognition and providing opportunities to engage in metacognition as part of the classroom experience (Bernacki et al., 2020, 2021; Freeman et al., 2011; Tanner, 2012; van Vliet et al., 2015). The overall goal of this approach is to increase students’ awareness and regulation of their learning processes in order to improve their achievement (Zepeda et al., 2015). Researchers have explored different approaches to teaching and prompting students’ metacognition, including instructional modules that students engage in outside class time (Bernacki et al., 2020, 2021; Cogliano et al., 2020), assignments or activities students complete as part of their courses (Stanton et al., 2015; Sabel et al., 2017), and cocurricular workshops delivered during class time (Hoffmann and McGuire, 2010; McGuire, 2015). The accompanying research suggests these approaches tend to be effective. In addition, in a meta-analysis focused on self-regulated learning scaffolds in online learning environments, Zheng (2016) found that both domain-general (i.e., aspects that support overall learning processes) and domain-specific (i.e., aspects that support learning in a particular subject area) scaffolds facilitated students’ self-regulated learning.

    The form of metacognition instruction of interest to the present study is that of a cocurricular workshop delivered during class time, as modeled by McGuire (Hoffmann and McGuire, 2010; McGuire, 2015). A workshop is a group-based educational intervention led by an expert and intended to develop participants’ knowledge and skills on a topic or small set of related topics (Brooks-Harris and Stock-Ward, 1999; Wolters and Hoops, 2015). In two core studies (Cook et al., 2013; Zhao et al., 2014), researchers found that metacognition workshops improved academic achievement in undergraduate science courses. While the results of this research are promising, design-related factors (e.g., lack of a comparison group or adequate controls for pre-intervention characteristics) suggest that more rigorous methods are needed to strengthen arguments for the benefits of metacognitive workshops. Moreover, additional research is needed to understand whether teaching about metacognition is enough, or if teaching about time management provides further benefits that warrant the additional class time.

    Although metacognition is an important component of self-regulated learning, it is just one of the areas that effective learners regulate; the other areas include motivation, behavior, and context (Panadero, 2017; Kim et al., 2020). Engaging in effective time management invokes a fuller range of self-regulatory processes, as it pertains to students’ skills in devoting time and effort toward academic work, including managing distractions in the environment and generating the motivation to get started rather than procrastinate. Thus, content on time management, including the closely related topics of managing distractions and procrastination (Claessens et al., 2007; Wolters and Brady, 2020), may complement and extend workshop content on metacognition.

    As with metacognition, learning about time management seems to be an area where intervention can support college student success. Van der Meer et al. (2010) highlighted the need for instructional efforts to help students understand how to engage in effective time management. In this mixed-methods study, analyses of survey responses and interviews suggested that students were aware that college required a different approach to time management than high school but had difficulty identifying what specific changes might be needed.

    Intervention studies suggest that there are benefits to directly teaching individuals how to improve their time management through approaches that cover topics such as planning, managing distractions, and addressing procrastination. However, such interventions have not been adequately studied in the context of the college classroom. Two small-scale interventions suggested that a 4-hour training on time management could increase sense of control over time, as well as decrease stress (Häfner et al., 2015) and procrastination (Häfner et al., 2014). The commitment of time in these prior interventions may not be feasible for classroom-based application, however, and thus the findings and methods may not be fully transferable. Also, while this prior work suggests that time management interventions can yield desirable psychological and behavioral outcomes, it did not investigate connections to desirable academic outcomes. Although time management appears to be fundamental to student success in college (Robbins et al., 2004; Credé and Phillips, 2011) and time management workshops are commonplace on college campuses (Truschel and Reedy, 2009; Wolters and Brady, 2020), the academic outcomes of this type of intervention remain underresearched.

    Supporting Students from Minoritized Groups in Science

    Researchers focused on supporting student success in college often direct their efforts toward student subpopulations, such as students from ethnic and racial groups that are typically underrepresented in science (May and Chubin, 2003; Toven-Lindsey et al., 2015; Jordt et al., 2017). This underrepresentation of individuals of color in certain fields traces back to inequities and systematic exclusion of certain groups based on ethnicity or race (Asai, 2020b). Racial and ethnic identities that are underrepresented in science in proportion to their makeup of the overall U.S. population include Black or African American, Hispanic or Latinx, and Native American, Hawaiian, and Pacific Islander (National Science Foundation, 2019). The term “underrepresented minority” is often used to refer to these groups (Jordt et al., 2017). In this paper, we use the term “minoritized” (e.g., “students from minoritized groups”) to emphasize the role that social and educational systems have historically played in creating minoritized status and underrepresentation (Matthews and López, 2020).

    In her review of the literature related to enhancing diversity in science, technology, engineering, and mathematics (STEM), Tsui (2016) noted that college students from minoritized backgrounds tended to report challenges in their study skills and time management skills to a greater degree than White or Asian college students. For some students, this difference may relate to precollege preparation, where access to college-preparatory curriculum and resources may not have been widely available (Kuh et al., 2006, 2007). Additionally, in college, students from minoritized groups tend to work more hours in part- or full-time jobs than White or Asian students (Hurtado et al., 2010), which can increase time pressures and reduce time resources. Academic struggles can precede decisions to leave a major or an institution, and some scholars connect the underrepresentation of racial and ethnic minorities in science-related fields to early levels of academic preparation (Lewis, 2003; Chang et al., 2014). Systemic racism in education—specifically, inequities in resources, opportunities, and policies for students from minoritized groups—can be seen as the primary factor underlying differences in students’ academic preparation for college (Kendi, 2019). Thus, how and when students approach their studying are relevant to academic progression in science, yet they must also be considered in the broader context of inclusive and anti-racist education.

    In line with anti-deficit reframing, it is important to pursue lines of research that focus not on calling out underpreparation and academic challenges but rather on how students attain success (Harper, 2010) and how educational strategies work to foster inclusion (Asai, 2020a). Along these lines, life science researchers have emphasized the need for educational interventions that support academic success and broaden opportunities for engagement (Estrada et al., 2016; Lent et al., 2018). Common interventions include summer bridge programs and early arrival programs before students’ first year of college (Estrada et al., 2016), as well as mentoring and research opportunity programs that occur during the undergraduate experience (Toven-Lindsey et al., 2015). As other scholars have noted (e.g., Lewis, 2003; Harper, 2010), such programs can be conceptualized in ways that reinforce the deficit-minded perception that students from minoritized groups are lacking in academic skills. Ideally, however, these programs cultivate students’ existing strengths and serve to enhance equity. Toven-Lindsey et al. (2015), for example, tied their 2-year program to the overarching goal of creating a campus culture that is inclusive and welcoming. It is unclear, however, whether a brief intervention such as a classroom-based workshop might also support positive academic experiences for science students from minoritized backgrounds.

    Present Study

    Instructional efforts that impact achievement by supporting both metacognition and time management may be advantageous for students overall, and especially for students in biology from minoritized backgrounds. The purpose of this study is to examine the impacts of two different workshop interventions on introductory biology students’ academic beliefs, strategies, and achievement. This experimental study compares the impacts of two instructional interventions: a baseline Metacognition workshop and a Metacognition plus Time Management (Metacognition+TM) workshop that added content on managing time and the closely related topics of overcoming distractions and procrastination. We anticipated that both workshops would benefit students, but that students who participated in the extended Metacognition+TM workshop would experience greater benefits in terms of their development of adaptive academic beliefs and strategies, as well as increases in their exam grades. Our research questions were:

    1. In what ways, if any, do biology students show differences in subsequent academic beliefs and strategies based on participation in a Metacognition intervention versus a Metacognition+TM intervention?

    2. In what ways, if any, do biology students show differences in their subsequent exam performance based on participation in a Metacognition intervention versus a Metacognition+TM intervention?

    3. In what ways, if any, do the workshop interventions improve beliefs, strategies, or exam performance differently based on students’ minoritized or majority group status (i.e., are there ways in which a given workshop is more impactful for students who are traditionally underrepresented in biology)?

    METHODS

    We used a pre/posttest design with 17 biology laboratory (lab) sections of ∼24 students each, with each lab section randomly assigned to one of two intervention conditions. The two workshop-based interventions took place during regularly scheduled lab meeting times the week following the release of students’ grades for the first course exam. Exam grades, as well as pre- and post-intervention measures reflecting students’ academic beliefs and strategies, were collected to understand potential impacts of the workshops. In the following sections, we describe the course context and participants, interventions, timeline, measures, and analyses.

    Course Context and Participants

    The workshops and data collection took place at a large, 4-year, public university in the Midwestern United States during the Autumn 2019 semester. The instructional context was a foundational course that required students to understand and apply biological concepts. Specifically, the study took place in a high-enrollment, introductory-level undergraduate course. The course was administered by the life sciences education center and emphasized topics such as cells, energy transfer, genetics, and the chemistry of life. Each week, students spent 3 hours in lecture and an additional 3 hours in lab.

    The study was reviewed and approved by the university’s Institutional Review Board. Because the activities were structured as an educational intervention designed to support student success, all 404 students enrolled in the course were assigned to one of the two interventions. From this total, 249 students (133 Metacognition and 116 Metacognition+TM) consented to participate in the study, representing a 61.6% participation rate. Demographic data, including gender, race/ethnicity, and first-generation status, were obtained from university records. Consistent with other recent research in STEM contexts (e.g., Canning et al., 2019), students identifying as White or Asian were considered majority status. Students who reported another racial or ethnic group, including Black and/or Hispanic, were considered minoritized status. Table 1 reports the demographic characteristics.

    TABLE 1. Gender, minoritized or majority group status, and first-generation status for participants

    Characteristic^a            Total n (%)    Metacognition n (%)    Metacognition+TM n (%)
    Gender
      Female                    147 (59.0)     79 (59.4)              68 (58.6)
      Male                      102 (41.0)     54 (40.6)              48 (41.4)
    Race/ethnicity
      Minoritized                30 (12.0)     14 (10.5)              16 (13.8)
      Majority                  213 (85.5)     115 (86.5)             98 (84.5)
      Not able to determine       6 (2.4)       4 (3.0)                2 (1.7)
    First-generation status
      First generation           63 (25.3)     30 (22.6)              33 (28.4)
      Continuing generation     186 (74.7)     103 (77.4)             83 (71.6)

    ^a The students from minoritized groups category includes students who identified as Black/African American or Hispanic/Latinx. No students in the sample identified as Native American, Hawaiian, and Pacific Islander. The majority status students category includes students who identified as White or Asian. The not able to determine category pertained to students who identified as two or more races/ethnicities, non-resident aliens, or unknown/not disclosed. N = 249.
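As a quick arithmetic check, the counts and percentages in Table 1 are internally consistent. The sketch below recomputes the reported figures from the raw counts; the dictionary and the `pct` helper are our own illustrative constructions, not part of the study's analysis code.

```python
# Counts from the Total column of Table 1; each percentage is n / 249 * 100,
# rounded to one decimal place as in the published table.
total_n = 249

counts = {
    "Female": 147, "Male": 102,
    "Minoritized": 30, "Majority": 213, "Not able to determine": 6,
    "First generation": 63, "Continuing generation": 186,
}

def pct(n, denom=total_n):
    """Percentage rounded to one decimal, matching the table's reporting."""
    return round(100 * n / denom, 1)

# Each demographic breakdown sums to the full consenting sample of 249.
assert counts["Female"] + counts["Male"] == total_n
assert counts["Minoritized"] + counts["Majority"] + counts["Not able to determine"] == total_n
assert counts["First generation"] + counts["Continuing generation"] == total_n

# The reported participation rate: 249 of 404 enrolled students.
assert pct(249, denom=404) == 61.6
```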

    Timeline

    During the semester, students completed pre- and post-intervention self-report surveys, as well as three exams. The first exam took place in week 5 of the semester. Two days later, students completed a consent form and were assigned an online, self-report pre-intervention survey of their academic beliefs and strategies. The next week (week 6), students received their exam grades and attended their regularly scheduled lab sections. During these required lab sections, students participated in either the Metacognition intervention or the Metacognition+TM intervention, based on random assignment of each lab section to an intervention. Students completed the second exam in week 10. Students completed the online, self-report post-intervention survey of their academic beliefs and strategies in the last week of the semester (week 16). The day after the 4-day window for completing the survey closed, students completed the third exam. Below, we describe the interventions in depth.
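The cluster-randomized assignment described above (lab sections, not individual students, as the unit of assignment) can be sketched as follows. The section labels, seed, and 9/8 split are hypothetical illustrations; the paper does not report how many sections fell in each condition.

```python
import random

# Hypothetical sketch of cluster randomization: 17 lab sections are
# shuffled and split between the two workshop conditions.
random.seed(42)  # fixed seed so the illustration is reproducible

sections = [f"Lab{i:02d}" for i in range(1, 18)]
random.shuffle(sections)

# With 17 sections, one condition necessarily receives one extra section.
assignment = {
    "Metacognition": sections[:9],
    "Metacognition+TM": sections[9:],
}

# Every section is assigned to exactly one condition.
assert sum(len(v) for v in assignment.values()) == 17
```

Randomizing intact sections rather than students avoids contamination within a lab meeting, since everyone in a given section receives the same workshop content.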

    Workshop Interventions

    Two workshop interventions were examined in the study: Metacognition, which focused on being self-aware as a learner and developing learning strategies for different levels of Bloom’s taxonomy; and Metacognition+TM, which included this core material plus additional content on time management and procrastination. Based on contemporary classroom-based research (Cook et al., 2013; Zhao et al., 2014) and earlier scholarly reviews about the benefits of teaching learning skills (Hattie et al., 1996; National Research Council, 2000), the research team determined that the metacognition workshop would serve as the baseline content delivered to all students, as having a no-workshop control group would place students at a disadvantage. Limited pilot work in the biology course in a prior semester suggested that students perceived value in the metacognition workshop but that their behaviors were not impacted beyond the very short term. These early findings led the instructor to develop booster assignments (described in this section) as a mechanism to support students in practicing and applying the knowledge and skills explored in the intervention as they moved forward in the course.

    As informed by McGuire’s (2015) model of teaching students how to learn effectively, both workshop interventions were held in students’ regular classroom (lab) setting following the results of the first exam. This approach emphasized the academic rigor of the workshop, communicated the instructor’s support for the content, and conveyed information to students when they were receptive to feedback on how they studied (McGuire, 2015). The workshops were designed collaboratively by members of the life sciences education center and student learning center, who held advanced degrees in biology education and educational psychology. The team developed the two workshops over several months, which included reviewing the literature, preparing drafts of workshop slides and activities, and conducting a test run and feedback session of each workshop in its near-final form. Each workshop was accompanied by a worksheet that provided space for students to take notes and identified the major takeaways. Additionally, in both workshops, the presenter asked questions to maintain engagement and prompted students to discuss the content with one another.

    On the day of the intervention, students came to their regularly scheduled lab sections and participated in an hourlong metacognition workshop led by their instructor. Based on their lab’s intervention condition, students were then either released from lab (Metacognition intervention) or participated in a second hourlong workshop after a 10-minute break (Metacognition+TM intervention). The content of each workshop is described in the specific intervention sections below.

    Metacognition Intervention.

    For the Metacognition portion of the intervention, the course instructor presented a 1-hour workshop titled “Strategies to Get the Grade You Want in Bio [Course Number]!” The content was based on the model provided by McGuire (2015), which has been tested in prior classroom-based research (Cook et al., 2013; Zhao et al., 2014). The workshop focused on effective learning strategies and self-awareness in the learning process. The three main sections were: 1) matching the task with the tool, 2) applying the metacognitive process (planning, monitoring, and evaluating) as a way of purposefully learning, and 3) selecting learning strategies based on the depth of learning desired, following Bloom’s taxonomy of educational objectives (Bloom et al., 1956).

    In the first section, the instructor began by sharing feedback from former students about how learning about effective strategies had helped them to improve their grades. Students then reflected on whether they had ever performed poorly on an exam, even when they had worked hard to feel prepared for it. To introduce the importance of matching a task with the correct tool, students engaged in an active-learning exercise in which they brainstormed what they might do next when approaching a massive redwood tree with just a handsaw. Then, students were prompted to consider how they would prepare differently for the task if they could go back and try again. The purpose of the activity was to set up students to recognize how the example might be analogous to their own classroom experiences: if they are not using effective and efficient learning strategies, they are less likely to succeed.

    In the second section, students considered the difference between studying and learning (McGuire, 2015). The key takeaway was that going through content with the purpose of truly understanding was more effortful—but also more valuable (Rovers et al., 2018). To help students prepare to learn at this level, the instructor defined metacognition as thinking about your own thinking (Dimmitt and McCormick, 2012) and described a three-step process for engaging in metacognition. Focusing on the metacognitive aspects of a three-phase model of self-regulated learning (Zimmerman and Schunk, 2001), this process included identifying learning strategies, monitoring learning as it took place, and evaluating the effectiveness of the approach, often via reflection, in order to guide adjustments and plan more effectively for the next study task.

    In the third section, the instructor introduced the levels of Bloom’s taxonomy, ranging from basic (e.g., remembering or understanding) to increasingly more complex (e.g., applying, analyzing, or evaluating). The instructor asked students to consider the level of learning required in college as opposed to high school (McGuire, 2015), provided examples of exam questions, and asked students to identify the level of learning required in order to reinforce the idea that memorization and basic understanding were often insufficient in college biology. The instructor concluded by elaborating upon how planning, monitoring, and evaluating could help students approach learning as a metacognitive process.

    Metacognition+TM Intervention.

    In the lab sections in which the Metacognition+TM intervention took place, students first attended the 1-hour Metacognition workshop. Following a 10-minute break, the course instructor introduced the learning center representative, who presented a 1-hour workshop titled “Taking Control of Your Time Management and Procrastination.” The additional material for the Metacognition+TM intervention was informed by van Eerde and Klingsieck’s (2018) meta-analysis of procrastination intervention studies, as well as relevant research on learning, time management, and motivational regulation (Wolters, 2003a,b; Wolters and Brady, 2020). The three main sections were 1) using calendars and task lists, 2) removing distractions, and 3) implementing procrastination management strategies.

    The first section began with a group discussion of how time management in college was different from time management in high school. Then, the presenter introduced the curve of forgetting (Roediger et al., 2010) and explained how frequent study sessions spaced out over time were more effective than cramming (Hopkins et al., 2016). To aid in planning, students learned about creating an integrated master syllabus that brought together the readings, topics, assignments, and exams from all their classes. To counteract being overwhelmed and not knowing where to start, students learned to translate their running list of all tasks into a manageable set of daily objectives, with a focus on identifying their top three tasks each day and working on the top priority task first (Van Eerde, 2000). Students also learned to incorporate study tasks into their calendaring system to increase the likelihood of follow-through (Steel and Konig, 2006).

    The second section of the workshop emphasized the importance of the study environment in terms of promoting either focus or inefficiency (Dewitte and Schouwenburg, 2002). Students learned to assess their common distractions and manage the study environment to reduce distractions. Here, the presenter introduced specific techniques to increase focus, such as the pomodoro technique (Oakley, 2014), and apps that shut out distractions and rewarded time on task.

    The third and final section of the workshop addressed procrastination specifically. Students discussed when they typically procrastinated, and the presenter emphasized common causes for procrastination connected with students’ examples, such as low self-efficacy (Wolters, 2003b), a need for immediate gratification (Harrington, 2005), and low motivation (Howell and Buro, 2009). Students learned to identify and counteract rationalizations (i.e., excuses) typically used to justify procrastination (Tuckman, 2005), as well as to implement strategies to address procrastination, such as seeking help, breaking tasks into smaller pieces, using positive self-talk, and adding structure through self-set rewards or peer accountability (Wolters and Benzon, 2013).

    Booster Assignments.

To support students’ application of the content from the workshops, the instructor introduced an extra-credit assignment opportunity. These optional booster assignments gave students weekly opportunities to plan and reflect on their study strategies. At the beginning of the week, students listed three or more strategies they would commit to using to improve their learning. At the end of the week, students listed the strategies they actually used and reflected on how well the strategies seemed to work for them. In this way, the booster assignments provided a form of accountability that encouraged students to regularly engage with strategies learned during the workshop(s). Students submitted the optional assignments online each week, and the instructor provided reminders a few times during the semester to encourage continued use of the activities to support success in the course. We designed this intervention to motivate students to modify their study behaviors and develop self-regulation. We reasoned that mandatory reflective exercises could be counterproductive, as prior research has suggested that choice is an important element in students’ motivation and mental engagement (Reeve, 2002; Niemiec and Ryan, 2009). Ultimately, we let students decide whether to complete this activity, with the extra credit as an incentive for engaging in a task designed to develop new skills (Williams and Stockdale, 2004). Students from both intervention groups completed a similar average number of booster assignments; therefore, any effect of completing them should have similarly impacted the Metacognition and Metacognition+TM students’ post-intervention survey responses.

    Measures

    Students’ prior academic performance data were gathered from university records. Pre- and post-intervention surveys assessed students’ self-reported academic beliefs and strategies. The two surveys, with 68 items each, were built into the course as assignments to provide credit for student participation. Each survey provided ∼1% of the total course points (2% total for both surveys). All items were closed-ended and used a five-point Likert-type response scale where, unless otherwise indicated, strongly disagree = 1 and strongly agree = 5. Because the self-report measures included items with wording adapted from the original items, we conducted factor analyses to determine the underlying factor structure in the present study. The measures and, where applicable, factors are defined in the Academic Beliefs and Academic Strategies sections below. A description of the factor analyses appears in the subsequent section.

    Academic Backgrounds.

    Students’ standardized test scores and beginning-of-semester cumulative GPA were acquired from university records. Standardized test scores reflected either the composite ACT score or the equivalent SAT–ACT score conversion, using 2018 concordance tables (ACT, 2018). Students’ academic backgrounds are summarized in Supplemental Table S1.

    Academic Beliefs.

Incremental and entity beliefs about intelligence (Dweck, 1999), also known as mindsets, reflected students’ beliefs about whether their intelligence was malleable. Students’ incremental and entity beliefs before the intervention were used as covariates to control for students’ expectations about whether they would be able to increase their knowledge or ability to learn, which might influence receptivity to the workshop content. In prior studies with college students, incremental beliefs about intelligence predicted students’ mastery goal orientation (Lou et al., 2017) and entity beliefs predicted students’ external academic locus of control, that is, beliefs that their outcomes were determined by other people or events (Bodill and Roberts, 2013). Consistent with recent research, this scale contained two factors that were distinct but not strictly opposite from one another. A sample item for incremental beliefs (four items) was “You can always substantially change how intelligent you are.” A sample item for entity beliefs (four items) was “You can learn new things, but you can’t really change your basic intelligence.”

Self-efficacy for self-regulated learning, derived from the original scale (Bandura, 2006; Usher and Pajares, 2007), assessed how confident students were in their ability to complete certain academic activities successfully. In previous studies of college students, this scale was correlated with student enthusiasm as well as mastery goal orientation, that is, the desire to learn and improve (Gerhardt and Brown, 2006). The eight items asked students to gauge their confidence for activities such as to “fully understand the information presented in class and textbooks.” The response scale ranged from not confident = 1 to very confident = 5.

    College commitments, from the College Persistence Questionnaire (Davidson et al., 2009), assessed students’ commitment to their current institution or to earning a college degree. Institutional and degree commitments are predictors of persistence (Davidson et al., 2009), reflecting the important role that students’ academic goal intentions play in college retention (Nora, 2004). Aligning with the questionnaire’s subscales, the items reflected two factors pertaining to different aspects of students’ commitment to continuing college, both at a specific institution and as an overall aim. A sample item from the institutional commitment factor (four items) was “How confident are you that [university name] is the right university for you?” A sample item from the degree commitment factor (three items) was “At this moment in time, how strong would you say your commitment is to earning a college degree, at [university name] or elsewhere?” The response scale had different labels corresponding with the type of question (e.g., 1 = not confident, 5 = very confident; 1 = not strong, 5 = very strong).

    Academic Strategies.

    Motivational regulation strategies (Brief Regulation of Motivation Scale; Kim et al., 2018) assessed students’ use of strategies to maintain their effort toward and interest in studying and schoolwork (eight items). In prior research with college students, this scale was related to but empirically distinct from the regulation of other aspects of learning, such as cognition, and was predictive of lower procrastination and higher course grades (Kim et al., 2020). A sample item was “If I feel like stopping before I’m really done, I have strategies to keep myself studying.”

    Metacognitive strategies, derived from the original scale in the Motivated Strategies for Learning Questionnaire (Pintrich et al., 1993), gauged the degree to which students engaged in metacognition by taking an active role in understanding and directing their learning processes. In previous studies of college students, this scale predicted lower amounts of procrastination (Wolters, 2003b) and was associated with high amounts of intrinsic motivation (Young, 2005). Unlike in prior research that assessed metacognition unidimensionally (Wolters, 2003b; Young, 2005) or in terms of three underlying components of planning, monitoring, and evaluating (Donker et al., 2014), in the present study, this scale contained two separate factors based on students’ metacognitive engagement either before or during/after learning activities. A sample item for the planning factor (three items) was “Before I begin to study, I plan out what I want to get done.” A sample item for the monitoring and evaluating factor (seven items) was “As I study, I frequently check to make sure I really understand the material.”

Procrastination, derived from the original scale (Tuckman, 1991), measured students’ tendencies to postpone academic work or miss deadlines. In prior studies with college students, this scale was predicted by low self-efficacy (Hensley, 2014) and negatively associated with course grade (Hensley, 2014) and cumulative GPA (Jackson et al., 2003). In the present study, this scale contained two separate factors based on behavior patterns, on the one hand, and the outcomes of procrastination, on the other. A sample item for the postponement factor (seven items) was “I postpone getting started on things I don’t like to do.” A sample item for the missed deadlines factor (two items) was “I often don’t get assignments done on time.”

Time management strategies, derived from the original scale (Macan, 1994), measured students’ tendencies to prioritize and schedule their time. In previous studies of college students, the scale was negatively correlated with aspects of engaged learning, such as value for learning, metacognitive strategies, and motivational strategies (Wolters et al., 2017). In the present study, the time management items reflected two separate factors based on students’ general tendency to organize and prioritize their use of time, on the one hand, and use of tools to keep track of their tasks and schedules, on the other. The intentional time use factor (15 items) included items such as “I set priorities to determine the order in which I will perform schoolwork each day.” The time management tools factor (three items) included items such as “I use a personal calendar to keep track of important events, obligations, or deadlines.”

    Factor Analyses and Alphas.

Seven exploratory factor analyses (EFAs) using principal axis factoring were conducted in SPSS to examine the underlying factor structure of all scales and to assess for any suboptimal items. This data-driven approach made no assumptions about the pattern of relationships in the data and was used to explore the dimensionality of each scale (Knekta et al., 2019). Moreover, this method allowed us to explore the underlying dimensionality of these scales and assess for low factor loadings and high cross-factor loadings when administered to a sample of biology students, whereas previous scale-validation efforts took place outside the STEM context. Factors were retained based on Kaiser’s criterion, scree plot analyses, and the use of rotation methods to achieve a simple, theoretically meaningful factor solution (Worthington and Whittaker, 2006). Based on the EFA models, one item was deleted from the procrastination subscale and one item was deleted from self-efficacy for self-regulated learning because of low factor loadings and high cross-loadings. Confirmatory factor analyses (CFAs) for the full sample were then examined using Mplus (v. 8.4) to evaluate the validity of the scales and test the factor solutions obtained from our EFAs (Worthington and Whittaker, 2006). The CFAs were primarily used as an assessment of our measurement model before we proceeded to the main structural specification of our structural equation models (SEMs); thus we did not use a cross-validation procedure, typically used in scale-validation studies (e.g., Vodanovich et al., 2005), which would have split the sample in half for the EFA and CFA.
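Kaiser’s criterion, one of the retention rules named above, can be illustrated with a short sketch: eigenvalues of the item correlation matrix are computed, and factors whose eigenvalues exceed 1 are candidates for retention. The correlation matrix below is a constructed toy example, not the study’s data:

```python
import numpy as np

# Hypothetical correlation matrix for four Likert-type items with a
# two-factor structure (items 1-2 and items 3-4 form correlated pairs).
# Illustrative values only, not data from this study.
R = np.array([
    [1.0, 0.8, 0.1, 0.1],
    [0.8, 1.0, 0.1, 0.1],
    [0.1, 0.1, 1.0, 0.8],
    [0.1, 0.1, 0.8, 1.0],
])

# Kaiser's criterion: retain factors whose eigenvalues exceed 1.
eigenvalues = np.linalg.eigvalsh(R)[::-1]  # sorted largest first
n_retained = int(np.sum(eigenvalues > 1))
print(eigenvalues)  # approximately [2.0, 1.6, 0.2, 0.2]
print(n_retained)   # two factors suggested
```

In practice, as in the study, this rule is weighed alongside scree plots and the interpretability of the rotated solution rather than applied mechanically.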

For several of the scales, factor analyses corroborated the presence of two distinct factors. Specifically, factor analyses aligned with recent research that identified the two-factor structure of procrastination in a separate sample (Wolters et al., 2020), as well as prior research that identified the distinctiveness of different elements of college commitments (Davidson et al., 2009) and the two-factor structure of beliefs about intelligence (Tempelaar et al., 2015). New to the current analyses was the identification of the two-factor structures of metacognitive strategies and time management strategies. These two-factor structures were retained based on the examination of the scree plot, clearly patterned factor loadings, and conceptual cohesion among the items making up each factor. Although factor analyses suggested some possibility of a two-factor structure for self-efficacy for self-regulated learning, the high cross-loadings among multiple items and the lack of distinct overarching constructs for separate factors supported keeping the items as a single scale.

Alphas and fit indices for the scales and underlying factors appear in Table 2. Internal consistency was assessed using coefficient alpha, a commonly used statistic for establishing internal consistency (Netemeyer et al., 2003). Obtained alphas above 0.70 for all scales except the pre-intervention college commitment factors indicated good internal consistency. The college commitment factors displayed acceptable yet slightly low consistency values on the pre-intervention survey, owing in part to the low number of items and to variability in students’ views of their commitment to the institution and to earning a degree at the beginning of the semester. Because these scales’ reliability was not prohibitively low, and because the scales had exhibited strong reliability and validity with larger samples across multiple institutions in prior research (Davidson et al., 2009), we retained them for use in the present study.
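Coefficient alpha itself is straightforward to compute from item-level responses; a minimal sketch with toy data (not the study’s responses; the function name is ours):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Coefficient alpha for a respondents-by-items score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of scale totals
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Toy responses (5 respondents x 3 items on a 1-5 Likert scale);
# illustrative only, not data from this study.
scores = np.array([
    [4, 5, 4],
    [3, 3, 3],
    [5, 4, 5],
    [2, 2, 3],
    [4, 4, 4],
])
print(round(cronbach_alpha(scores), 2))  # 0.92
```

Higher values reflect more consistent responding across a scale’s items, with 0.70 the conventional floor for adequate internal consistency used in the study.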

    TABLE 2. Alphas and fit indices for the scales and underlying factors for both pre- and post-intervention surveys

Scale | Alpha | χ2 (df) | RMSEA | CFI | SRMR
Beliefs about Intelligence (pre) | 0.86 (Incremental Beliefs), 0.89 (Entity Beliefs) | 46.892 (19) | 0.077 | 0.976 | 0.033
Self-Efficacy for Self-Regulated Learning (pre) | 0.86 | 135.189 (32) | 0.114 | 0.886 | 0.055
Self-Efficacy for Self-Regulated Learning (post) | 0.87 | 144.112 (32) | 0.119 | 0.895 | 0.057
Goal Commitment (pre) | 0.66 (Institutional Commitment), 0.68 (Degree Commitment) | 18.478 (11) | 0.052 | 0.983 | 0.047
Goal Commitment (post) | 0.70 (Institutional Commitment), 0.71 (Degree Commitment) | 13.616 (11) | 0.031 | 0.995 | 0.025
Motivational Regulation (pre) | 0.81 | 22.414 (20) | 0.022 | 0.995 | 0.029
Motivational Regulation (post) | 0.84 | 47.634 (20) | 0.074 | 0.959 | 0.041
Metacognitive Strategies (pre) | 0.70 (Planning), 0.81 (Monitoring and Evaluating) | 64.26 (33) | 0.062 | 0.954 | 0.045
Metacognitive Strategies (post) | 0.74 (Planning), 0.85 (Monitoring and Evaluating) | 82.457 (33) | 0.078 | 0.946 | 0.043
Procrastination (pre) | 0.90 (Postponement), 0.78 (Missed Deadlines) | 73.909 (26) | 0.071 | 0.972 | 0.041
Procrastination (post) | 0.90 (Postponement), 0.87 (Missed Deadlines) | 97.046 (26) | 0.088 | 0.962 | 0.045
Time Management Strategies (pre) | 0.85 (Intentional Time Use), 0.84 (Time Management Tools) | 189.54 (132) | 0.042 | 0.955 | 0.048
Time Management Strategies (post) | 0.85 (Intentional Time Use), 0.81 (Time Management Tools) | 288.375 (132) | 0.069 | 0.900 | 0.058

    Measurement Invariance.

    As a goal of this study was to examine differences between students from majority and minoritized backgrounds, we conducted tests of measurement invariance to ensure that the questionnaire items measured the same theoretical constructs similarly in both groups. Tests of measurement invariance are commonly used as a prerequisite for group comparison. If measurement invariance is not tenable, analyses of our measures do not produce meaningful results, as findings of differences between groups cannot be unambiguously interpreted (Horn and McArdle, 1992). We considered three measurement invariance steps: 1) configural, equivalence of model form; 2) metric, equivalence of factor loadings; and 3) scalar, equivalence of item intercepts (Putnick and Bornstein, 2016; Wang and Wang, 2019).

Measurement invariance was tested using a multi-group CFA. Measurement invariance was tenable for nearly all scales: all displayed scalar invariance except metacognitive strategies and time management. Notably, time management originally displayed scalar non-invariance; there was no significant difference between the configural model and the metric model, χ2 (16) = 10.78, p = 0.82, but there was a significant difference between the metric model and the scalar model, χ2 (16) = 27.8, p = 0.03. Additionally, metacognitive strategies originally displayed metric non-invariance; there was a significant difference between the configural model and the metric model, χ2 (8) = 15.78, p = 0.045. In accordance with Putnick and Bornstein (2016), we omitted items with noninvariant intercepts and loadings and retested the configural, metric, and scalar invariance models. Consequently, one item was dropped from monitoring and evaluating and one item was dropped from intentional time use to obtain adequate scalar invariance. Additional statistics concerning the scales can be found in Supplemental Table S2.
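These nested-model comparisons rest on the fact that the difference between two nested models’ χ2 statistics is itself χ2-distributed, with degrees of freedom equal to the difference in df. A minimal sketch (using SciPy, an assumption about the reader’s tooling; the function name is ours, and the statistics are the ones reported above for time management):

```python
from scipy.stats import chi2

def chi2_difference_p(delta_chi2: float, delta_df: int) -> float:
    """p-value for a nested-model chi-square difference test:
    upper-tail probability of the chi-square distribution."""
    return chi2.sf(delta_chi2, delta_df)

# Time management comparisons reported in the text:
# configural vs. metric: chi2(16) = 10.78 -> invariance holds (p = 0.82)
# metric vs. scalar:     chi2(16) = 27.8  -> invariance fails (p = 0.03)
print(round(chi2_difference_p(10.78, 16), 2))  # 0.82
print(round(chi2_difference_p(27.8, 16), 2))   # 0.03
```

A significant difference means the more constrained model fits significantly worse, so the corresponding level of invariance is not supported.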

    Exams.

Academic achievement was based on students’ grades from three course exams. Exam 1 was administered before the workshops and thus provided a baseline of student performance. Exam 2 reflected students’ achievement for material covered 1 month after the workshops, and exam 3 reflected students’ achievement for the final third of the course. Each exam covered approximately one-third of the course content and was worth 11% of the final grade in the course. Exams consisted of multiple-choice questions written to assess one or more course learning outcomes and sub-outcomes at specific levels of Bloom’s taxonomy. Though course concepts built on one another as the term progressed, each exam focused on different concepts explored throughout the course. Two versions of each exam were administered, both with the same questions, but with the questions ordered differently. Cronbach’s alphas were similar between forms at each time point and demonstrated good internal consistency. For the two forms of each exam, the following alphas were observed: exam 1a, α = 0.88; exam 1b, α = 0.86; exam 2a, α = 0.84; exam 2b, α = 0.85; exam 3a, α = 0.87; exam 3b, α = 0.89. For our analyses, we used percent scores computed by dividing the number of items answered correctly by the total items on each exam. For exam 1, the range of scores earned by students was 23.15–96.30 (M = 70.95, SD = 15.13). For exam 2, the range was 34.00–100.00 (M = 79.37, SD = 12.83). For exam 3, the range was 35.29–100.00 (M = 77.64, SD = 15.19).

    Booster Assignments.

Students were able to complete up to a total of 10 planning assignments and 10 reflection assignments. The instructor reviewed the online submissions weekly. Each assignment was worth 0.3 percentage points toward the final course grade, for a maximum of 6 percentage points (20 assignments), awarded at the end of the semester. The total number of points gained by each student on the booster assignments was included to account for intervention dosage (Lazowski and Hulleman, 2016) by means of students’ ongoing engagement with the workshop concepts. The range of scores was 0.00–6.00 (M = 4.26, SD = 1.81).

    Analyses

To compare the effectiveness of the two workshops, we examined changes in self-reported academic beliefs and strategies, as well as students’ exam grades. The analyses included preliminary analyses as well as SEM to examine both main effects and interactions based on minoritized group status. With the SEMs that examined the efficacy of the Metacognition and Metacognition+TM treatments, we used a residualized change approach (e.g., Pittman and Richmond, 2008) to determine whether intervention type was associated with changes in any of the academic belief, strategy, or achievement outcomes, that is, to determine whether there were group differences in the amount of change. Subsequently, each post-intervention outcome was regressed on the intervention condition (Metacognition or Metacognition+TM), its pre-intervention score, and a set of relevant covariates accounting for students’ demographic and academic backgrounds. Our focal variable (intervention condition, Metacognition+TM in tables) therefore expressed differences between the Metacognition and Metacognition+TM groups at post intervention while holding constant the pre-intervention score and other included covariates. Given our limited sample size, we modeled each distinct factor in its own separate SEM. An example model using the time management scale can be found in Supplemental Figure S1. The advantages of SEM over alternative methods included estimates of measurement error in all variables, incorporation of both observed and latent variables, and estimation of indirect effects (Byrne, 2006). Missing data were handled using a maximum-likelihood estimation method. We adopted the Benjamini-Hochberg correction (Benjamini and Hochberg, 1995) to control for the potentially inflated Type I error rate due to the multiple comparisons in this study.
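The logic of the residualized change approach can be illustrated outside the SEM framework with ordinary least squares: each post-intervention outcome is regressed on its pre-intervention score and the intervention condition, so the condition coefficient expresses the group difference at post intervention holding the pre score constant. A minimal sketch with constructed toy data (not the study’s data; all variable names are ours):

```python
import numpy as np

# Toy data: pre-intervention scores, intervention condition
# (0 = Metacognition, 1 = Metacognition+TM), and post scores built
# to follow an exact linear rule; illustrative only.
pre = np.array([3.0, 3.5, 4.0, 2.5, 3.2, 3.8])
condition = np.array([0, 1, 0, 1, 0, 1])
post = 1.0 + 0.6 * pre + 0.3 * condition

# Regress post on an intercept, the pre score, and condition.
X = np.column_stack([np.ones_like(pre), pre, condition])
intercept, b_pre, b_condition = np.linalg.lstsq(X, post, rcond=None)[0]

# b_condition is the residualized change estimate: the group
# difference at post intervention, holding the pre score constant.
print(round(b_condition, 3))  # 0.3
```

The study’s actual models additionally include the demographic and academic covariates described below and estimate the outcome as a latent variable, but the interpretation of the condition coefficient is the same.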

    Covariates included gender, minoritized group status, first-generation status, ACT score, GPA, and exam 1 score. The analyses also controlled for incremental and entity beliefs about intelligence measured on the pre-intervention survey, to account for the degree to which students might expect their intelligence to be able to change, as well as booster assignment completion, to account for students’ level of ongoing engagement with intervention content. In addition, all analyses of self-reported academic strategies and beliefs controlled for the respective pre-intervention score (e.g., motivational regulation at the end of the semester controlled for students’ motivational regulation beliefs measured before the intervention).

    RESULTS

    Preliminary Analyses

    Associations among the Variables.

    The correlations between the survey and exam measures are available in Supplemental Table S3. Although the correlation between institutional commitment on the pre- and post-intervention surveys was moderate and significant, it was notably weaker than the other correlations on the diagonal, suggesting that this construct may be less stable or more amenable to change over the course of a semester as compared with the other constructs. The overall pattern of associations between survey measures and exam scores affirmed the academically adaptive nature of the academic beliefs and strategies examined in the study, as well as the maladaptive nature of procrastination. In addition, the extent to which students completed the optional booster assignments over the second two-thirds of the course had positive correlations with nearly all of the self-reported belief and strategy variables and (not shown in the table) with exam 2 and 3 scores (r = 0.20, p < 0.01 for both exams).

    Descriptive Statistics.

Supplemental Table S4 presents the means and standard deviations for survey measures and exam scores for each intervention group. These raw differences, however, did not account for important covariates related to students’ prior academic achievement, background, and engagement with booster assignments to support application of the workshop content. The study’s main analyses, described in the following section, tested for statistical significance in pre/post changes while accounting for demographic and academic covariates.

    Differences between Metacognition and Metacognition+TM Student Outcomes

We conducted structural equation modeling to examine the main effects of workshop type on students’ academic belief, strategy, and achievement outcomes. Findings indicated positive effects of the Metacognition+TM workshop on students’ degree commitment and exam grades. Because students were administered the same items at both pre and post intervention, it was substantively reasonable that these responses would be associated with each other (i.e., that the unexplained variances at the two time points would be correlated). Therefore, following Sörbom’s (1989) recommendations to improve model fit, we allowed error covariances between the same items at pre and post intervention. After determining that model fit was acceptable, we examined the significance of the treatment effect. These aspects are described in turn below.

Based on goodness-of-fit indices for the SEM analyses, nearly all factors yielded values indicating acceptable model fit, as seen in Table 3. Model performance was evaluated using several goodness-of-fit indices: χ2, root-mean-square error of approximation (RMSEA), comparative fit index (CFI), and standardized root-mean-square residual (SRMR). Acceptable model fit is indicated by RMSEA values <0.08, SRMR values <0.10, and CFI values of 0.90 or above (Hu and Bentler, 1999; Schermelleh-Engel and Moosbrugger, 2003). Generally, all models besides self-efficacy for self-regulated learning and intentional time use, which were not interpreted in subsequent analyses, demonstrated acceptable model fit based on these guidelines.

    TABLE 3. Goodness of fit for separate SEMs based on χ2, RMSEA, CFI, and SRMRa

Measure | χ2 (df) | RMSEA | CFI | SRMR
Self-efficacy for self-regulated learning | 978.68 (550) | 0.063 | 0.856 | 0.074
Institutional commitment | 348.37 (213) | 0.057 | 0.903 | 0.073
Degree commitment | 277.222 (172) | 0.056 | 0.899 | 0.075
Motivational regulation | 626.12 (422) | 0.049 | 0.909 | 0.071
Planning | 258.56 (172) | 0.050 | 0.936 | 0.071
Monitoring and evaluating | 445.17 (310) | 0.047 | 0.928 | 0.070
Postponement | 559.62 (364) | 0.052 | 0.929 | 0.075
Missed deadlines | 255.943 (134) | 0.068 | 0.915 | 0.084
Intentional time use | 1319.492 (854) | 0.052 | 0.842 | 0.072
Time management tools | 309.20 (175) | 0.063 | 0.914 | 0.083
Exam 2 | 158.076 (89) | 0.063 | 0.944 | 0.078
Exam 3 | 165.755 (89) | 0.066 | 0.937 | 0.079

    aAll models besides self-efficacy for self-regulated learning and intentional time use, which were not interpreted in subsequent analyses, demonstrated acceptable model fit based on guidelines.
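The fit cutoffs applied to Table 3 can be expressed as a small helper function (hypothetical, not part of Mplus or any SEM package; the example values are taken directly from Table 3):

```python
def acceptable_fit(rmsea: float, cfi: float, srmr: float) -> bool:
    """Apply the cutoffs used in this study: RMSEA < 0.08,
    CFI >= 0.90, SRMR < 0.10 (Hu and Bentler, 1999)."""
    return rmsea < 0.08 and cfi >= 0.90 and srmr < 0.10

# Values from Table 3:
print(acceptable_fit(0.063, 0.914, 0.083))  # True (time management tools)
print(acceptable_fit(0.063, 0.856, 0.074))  # False (self-efficacy: CFI < 0.90)
```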

For the effect of treatment on each of the outcomes, SEM results provided only some evidence that the Metacognition+TM workshop improved students’ academic beliefs or strategies (Table 4). Given that our primary interest was the intervention effect, all results are presented as partially standardized estimates (using the STDY command in Mplus), which retain the original units for the predictor variables (e.g., 0 for Metacognition and 1 for Metacognition+TM) while standardizing the outcome variable. These estimates can be interpreted similarly to effect sizes (Kelley and Preacher, 2012).

    TABLE 4. Partially standardized estimates from analyses examining main effect of treatmenta

Predictor | Self-Eff | Inst | Degree | Motiv | Plan | Mon & Ev | Postpone | Missed | Intent | Tools | Exam 2 | Exam 3
Pre intervention | 0.58*** | 0.87*** | 0.42** | 0.75*** | 0.60*** | 0.54*** | 0.75*** | 0.65*** | 0.73*** | 0.68*** | | 
Treatment | 0.18 | 0.15 | 0.36** | 0.05 | 0.07 | 0.17 | 0.04 | −0.17 | 0.09 | 0.04 | 0.22** | 0.28**
Exam 1 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.01 | −0.01* | −0.01 | 0.00 | −0.01* | 0.04*** | 0.04***
ACT | −0.01 | 0.01 | 0.03 | −0.01 | −0.03 | −0.02 | 0.05** | 0.03 | −0.01 | 0.02 | 0.05*** | 0.03*
FGEN | 0.14 | −0.03 | −0.44* | 0.18 | 0.11 | 0.12 | −0.22 | −0.05 | 0.24 | 0.02 | −0.27** | −0.02
MIN | −0.17 | 0.16 | 0.70*** | −0.15 | −0.13 | −0.21 | −0.07 | −0.05 | 0.16 | −0.15 | 0.15 | 0.24
GPA | 0.52*** | 0.42* | 0.44* | 0.24 | 0.34* | 0.14 | 0.00 | −0.28 | 0.38** | 0.43** | 0.10 | 0.48***
Gender | 0.07 | −0.03 | 0.02 | 0.24* | 0.01 | 0.04 | 0.10 | 0.17 | 0.13 | −0.09 | 0.30*** | 0.05
Booster | 0.02 | 0.01 | 0.06 | 0.05 | 0.04 | 0.06 | 0.02 | −0.01 | 0.01 | 0.01 | 0.05* | 0.01
Incremental | 0.09 | −0.06 | −0.12 | −0.05 | 0.00 | 0.18 | −0.08 | −0.11 | −0.12 | 0.09 | 0.02 | −0.06
Entity | 0.00 | 0.11 | 0.18 | 0.07 | 0.07 | 0.01 | 0.11 | 0.12 | 0.10 | 0.02 | 0.04 | 0.11

    aSelf-Eff, self-efficacy for self-regulated learning; Inst, institutional commitment; Degree, degree commitment; Motiv, motivational regulation; Plan, planning; Mon & Ev, monitoring and evaluating; Postpone, postponement; Missed, missed deadlines; Intent, intentional time use; Tools, time management tools; FGEN, first-generation status (0 = continuing-generation student, 1 = first-generation student); MIN, students from minoritized groups (0 = majority, 1 = minoritized); Gender was coded 0 = female, 1 = male; GPA, cumulative university GPA at beginning of semester; Booster, credit earned for submission of optional booster assignments (planning and reflection worksheets).

    *p < 0.05.

    **p < 0.01.

    ***p < 0.001.

    The most notable shift in academic beliefs occurred for degree commitment. When controlling for students’ pre-intervention beliefs about earning a degree and other covariates, Metacognition+TM students rated their commitment to earning a degree 0.36 SDs higher than Metacognition students on the post-intervention survey (p = 0.006), an effect that remained statistically significant after applying the Benjamini-Hochberg correction. Students who participated in the Metacognition workshop remained fairly stable in their degree commitment, whereas those in the Metacognition+TM workshop became more committed to obtaining a degree. There were no statistically significant differences between the Metacognition and Metacognition+TM groups on the other academic beliefs or strategies. We did, however, observe a significant positive relationship between minoritized group status and degree commitment: controlling for workshop assignment, pre-intervention degree commitment scores, and other covariates, students from minoritized groups were predicted to rate their commitment to earning a degree 0.70 SDs higher on the post-intervention survey than majority status students (p < 0.001).

    Differences between the two workshop interventions were apparent in academic achievement. Students who participated in the Metacognition+TM workshop scored significantly higher on both exam 2 and exam 3 than the Metacognition group. For students with similar pre-intervention scores and other covariates, a student who participated in the Metacognition+TM workshop was predicted to score 0.22 SDs higher on exam 2 (p = 0.006) and 0.28 SDs higher on exam 3 (p = 0.001) than a student who attended only the Metacognition workshop.

    Differences in Outcomes for Minoritized and Majority Students

    As the engagement of students from historically underserved racial and ethnic backgrounds is a salient concern for educators and researchers (Museus et al., 2011; Estrada et al., 2016), we investigated minoritized group status as a potential moderator of the workshop treatment effect on all of our academic belief, strategy, and achievement outcomes. This interaction quantified how the effect of treatment on each academic outcome differed depending on whether the student was a member of a minoritized group. These moderation models were specified like the previous analyses but with the addition of a treatment × minoritized group status interaction term. The Benjamini-Hochberg correction was again applied to control the false discovery rate. A summary of these results can be found in Table 5.
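    The Benjamini-Hochberg step-up procedure used for multiple-comparison control can be sketched as follows. This is a generic textbook implementation with illustrative p-values, not the study's actual test results or Mplus workflow:

```python
def benjamini_hochberg(pvals, q=0.05):
    """Benjamini-Hochberg step-up procedure (controls the FDR at level q).

    Sort the m p-values, find the largest rank k with
    p_(k) <= (k / m) * q, and reject the k smallest p-values.
    Returns a list of booleans in the original order.
    """
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k_max = 0
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= rank / m * q:
            k_max = rank  # keep scanning: step-up, not step-down
    rejected = [False] * m
    for rank, i in enumerate(order, start=1):
        if rank <= k_max:
            rejected[i] = True
    return rejected

# Illustrative p-values only: the two smallest survive correction here.
decisions = benjamini_hochberg([0.001, 0.008, 0.039, 0.041, 0.20], q=0.05)
```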

    TABLE 5. Partially standardized estimates from analyses examining interaction between treatment and MIN status (n = 198)a

    Predictor | Self-Eff | Inst | Degree | Motiv | Plan | Mon & Ev | Postpone | Missed | Intent | Tools | Exam 2 | Exam 3
    Pre-intervention | 0.58*** | 0.68*** | 0.42*** | 0.76*** | 0.60*** | 0.54*** | 0.75*** | 0.66*** | 0.73*** | 0.69*** | — | —
    Treatment | 0.16 | 0.17 | 0.35* | 0.03 | 0.06 | 0.12 | 0.02 | −0.19 | 0.07 | −0.10 | 0.27** | 0.34***
    Exam 1 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.01 | −0.01* | −0.01 | 0.00 | −0.01 | 0.04*** | 0.04***
    ACT | −0.01 | 0.00 | 0.03 | −0.01 | −0.03 | −0.02 | 0.05* | 0.03 | −0.01 | 0.02 | 0.05*** | 0.03*
    FGEN | 0.13 | −0.12 | −0.44* | 0.18 | 0.11 | 0.10 | −0.23 | −0.06 | 0.23 | −0.04 | −0.26** | 0.00
    MIN | −0.25 | −0.07 | 0.64* | −0.24 | −0.16 | −0.36 | −0.16 | −0.08 | 0.10 | −0.64* | 0.32* | 0.44**
    GPA | 0.52** | 0.54** | 0.44* | 0.24 | 0.35* | 0.14 | 0.00 | −0.28 | 0.38** | 0.45** | 0.10 | 0.47***
    Gender | 0.06 | −0.03 | 0.15 | 0.23 | 0.01 | 0.02 | 0.09 | 0.17 | 0.13 | −0.13 | 0.31*** | 0.07
    Booster | 0.02 | 0.01 | 0.06 | 0.05 | 0.04 | 0.06 | 0.01 | −0.01 | 0.01 | 0.01 | 0.05* | 0.02
    Incremental | 0.10 | −0.04 | −0.12 | −0.05 | 0.00 | 0.17 | −0.08 | −0.11 | −0.12 | 0.09 | 0.02 | −0.06
    Entity | 0.00 | 0.13 | 0.18 | 0.07 | 0.07 | 0.01 | 0.11 | 0.12 | 0.10 | 0.02 | 0.04 | 0.11
    MIN × Treatment | 0.16 | 0.24 | 0.12 | 0.20 | 0.09 | 0.34 | 0.19 | 0.09 | 0.13 | 1.11** | −0.38 | −0.45

    aSelf-Eff, self-efficacy for self-regulated learning; Inst, institutional commitment; Degree, degree commitment; Motiv, motivational regulation; Plan, planning; Mon & Ev, monitoring and evaluating; Postpone, postponement; Missed, missed deadlines; Intent, intentional time use; Tools, time management tools; FGEN, first-generation status (0 = continuing-generation student, 1 = first-generation student); MIN, students from minoritized groups (0 = majority, 1 = minoritized); Gender was coded 0 = female, 1 = male; GPA, cumulative university GPA at beginning of semester; Booster, credit earned for submission of optional booster assignments (planning and reflection worksheets).

    *p < 0.05.

    **p < 0.01.

    ***p < 0.001.

    We observed a significant treatment × minoritized group status interaction (p = 0.002) on self-reported use of time management tools measured at the end of the semester when accounting for pre-intervention time management and other covariates (Figure 1 and Supplemental Figure S2). For majority students, there was little difference between the Metacognition and Metacognition+TM interventions (0.10 SDs). For students from minoritized groups, however, the difference between the Metacognition and Metacognition+TM students was much larger (1.01 SDs). Specifically, majority status students showed a slight increase in self-reported use of time management tools from pre intervention to post intervention in both the Metacognition workshop (Mpre-intervention = 3.78 [SD = 1.08], Mpost-intervention = 3.89 [SD = 0.95]) and the Metacognition+TM workshop (Mpre-intervention = 3.84 [SD = 0.94], Mpost-intervention = 3.91 [SD = 0.83]). In contrast, students from minoritized groups who participated in the Metacognition workshop alone reported a decrease (Mpre-intervention = 3.67 [SD = 1.06], Mpost-intervention = 3.40 [SD = 1.10]), whereas those who participated in the Metacognition+TM workshop reported an increase (Mpre-intervention = 3.65 [SD = 0.80], Mpost-intervention = 3.97 [SD = 0.95]). Model fit for this interaction model was adequate: χ2(185) = 313.835, RMSEA = 0.059, CFI = 0.919, SRMR = 0.082. Although minoritized group status showed a significant positive relationship with degree commitment in the main effects analysis, it did not significantly interact with workshop type for any of the other academic belief, strategy, or achievement outcomes.
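    The group-specific treatment effects quoted above follow directly from the Table 5 coefficients for the time management tools outcome: for majority students (MIN = 0) the effect is the treatment coefficient alone, while for minoritized students (MIN = 1) it is the treatment coefficient plus the interaction term:

```python
# Coefficients from Table 5, time management tools column (SD units)
b_treatment = -0.10    # Metacognition+TM effect when MIN = 0 (majority)
b_interaction = 1.11   # additional effect of treatment when MIN = 1

effect_majority = b_treatment                     # magnitude 0.10 SDs
effect_minoritized = b_treatment + b_interaction  # 1.01 SDs
```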

    FIGURE 1.

    FIGURE 1. Race/ethnicity by treatment interaction on time management tools. Compared with students from minoritized groups in the Metacognition-only intervention, students from minoritized groups in the Metacognition+TM intervention had greater self-reported use of time management tools at the end of the semester when accounting for pre-intervention time management and other covariates.

    DISCUSSION

    Introductory course work provides a context in which students can develop not only knowledge of core subject matter but also adaptive academic beliefs and strategies (Tinto, 2017). Still, too many students who complete introductory science courses fail to acquire foundational knowledge or the skills that support academic success and persistence. In response, researchers and practitioners alike have worked to design and administer academic interventions that equitably enhance students’ performance in STEM (White et al., 2008; Findley-Van Nostrand and Pollenz, 2017). Building on this work, our primary goal was to compare the effectiveness of two workshops designed to support students’ learning and achievement in introductory biology. Recognizing the importance of expanding educational opportunities for students typically underrepresented or systematically excluded from STEM fields due to race or ethnicity (Chang et al., 2014), we also examined whether the two workshop interventions impacted students from minoritized and majority groups in similar or distinct ways.

    Our results suggest three main conclusions. One, students who participated in the Metacognition+TM workshop reported greater commitment to earning a college degree, but overall the two workshops were not associated with a pervasive pattern of differences in students’ later motivation and strategic engagement. Two, students’ participation in the Metacognition+TM workshop was linked to greater performance on both subsequent course exams compared with those who participated in only the Metacognition workshop. Three, there was some indication that students from minoritized groups may have benefited to a greater extent from the workshop that included material designed to improve their time management and reduce their procrastination. In the following sections, we discuss these three major findings, identify avenues for additional research, and consider implications for practice.

    Workshop Content and Students’ Motivational Beliefs and Strategy Use

    With just one exception (i.e., degree commitment), students who completed the Metacognition and the Metacognition+TM workshops tended to report similar motivational beliefs and strategic behaviors at the end of the semester when accounting for a rigorous set of covariates. Because the covariates included the analogous construct assessed just before the workshop along with demographic and academic variables, this pattern of findings suggests that the two groups evidenced similar amounts of change in their motivational beliefs and strategic behaviors. A notable exception to the overall pattern of similarities between the two groups concerned students’ self-reported degree commitment, or their beliefs about whether they would persevere and earn a college degree. Accounting for covariates that included their initial degree commitment, students in the Metacognition+TM group tended to express a stronger intention or greater commitment to earning a college degree at the end of the semester when compared with those in the Metacognition group.

    Students’ commitment to their graduation goals is a key predictor of persistence (Allen et al., 2008). Moreover, prior research has found that degree commitment is strongly associated with motivation toward academic goals (Robbins et al., 2004). Though most of the academic beliefs and strategies assessed on the surveys did not change notably as a result of either intervention, it may be that degree commitment captured a broader response triggered by learning about productive and effective study strategies. The link between the Metacognition+TM workshop and improved degree commitment is intriguing and should be explored further in future research. In terms of practical implications, Kuh and colleagues found in their review of the college student success literature that “the extent to which a campus has an enacted mission that makes an explicit commitment to the success of all students appears to be related to graduation rates, persistence, and student engagement” (Kuh et al., 2006, p. 55). Although it is just one possible approach, facilitating workshops about strategies for effective learning and time management is a concrete example of enacting the mission of student success.

    The question arises as to why the additional content designed to improve students’ time management and reduce their procrastination would have produced an effect on students’ degree commitment. Sense of self-efficacy and agency have well-established and likely reciprocal ties to metacognition (Louis et al., 2011) and time management (Wolters and Brady, 2020). Behavioral change theory suggests that students who believe they are capable of learning biology and view themselves as in control of their learning behaviors are more likely to try new strategies to exert agency over their learning and achieve desired learning outcomes (Dye and Stanton, 2017). Because time management skills help students to engage in behavioral regulation, these skills may provide a concrete way of allocating time toward other self-regulatory efforts (Kim et al., 2020), such as metacognitive regulation. Combined with a foundation of metacognition, the additional curriculum on time management may have further enhanced students’ sense of confidence and agency in their ability to reach long-term goals (Svinicki, 2010). That is, learning about how to learn paired with learning how to use time effectively and to avoid self-sabotaging behaviors (i.e., procrastination) may have provided students with an increased sense that they were capable of accomplishing their goals, and this increased confidence promoted a greater sense of commitment to academics. In line with this view, researchers have found that self-efficacy and sense of control are associated positively with degree commitment (Robbins et al., 2004; Dewitz et al., 2009).

    Although the broader conclusion seems to be that the two workshops did not have substantially different impacts on students’ other academic beliefs and behaviors, this conclusion must be understood in light of two important caveats. First, our a priori belief that both workshops would benefit students prevented us from including a comparison group of students who did not complete either workshop. As a result, it is unclear whether both workshops benefited students in ways that would not have occurred otherwise. It is conceivable that, without a workshop, students in both groups would have exhibited more maladaptive change during the semester, as opposed to the stability or slight increases seen in the raw scores. Indeed, prior research suggests that, without intervention, college students’ academic beliefs and strategies may decline over the course of a semester (Zusho et al., 2003). For example, DiBenedetto and Bembenutty (2013) found that, for students in an intermediate biology course, self-reported self-efficacy, homework self-regulation, and use of help-seeking strategies decreased between the beginning and end of the semester. Thus, the relative consistency of academic beliefs and strategies for students in the present study may indicate that the content of both workshops buffered against this decline. To test this possibility, future studies need to incorporate a design that allows additional comparisons between students who receive specific supplemental material and those who do not.

    The second caveat is that the similarities between the effects of the two workshops may have resulted from the timing of the post-intervention survey. Any greater use of time management strategies or reduced procrastination by the Metacognition+TM students after exposure to the workshop materials may have dissipated by the time students completed the second survey at the end of the semester. One reason for this diminishing effect is that, in the 10-week span between the workshop and the second survey, students may have returned to their old habits. Future research could administer surveys at multiple points during the semester to assess change over time.

    Although the booster assignments were intended to promote ongoing use of workshop content through regular planning and reflection on study strategies, they were graded based on completion rather than the specific content students included. Still, the correlations of booster assignment completion with exam grades suggest that the assignments were related to students’ academic outcomes and could be considered a valuable component of the interventions. Indeed, prior research suggests that planning, self-monitoring, and reflecting on study strategies support academic engagement and reduce procrastination (Scheithauer and Kelley, 2017; Hensley and Munn, 2020). While we would suggest that booster assignments might be an effective way to promote study strategies in college biology, future research is needed to understand their role in learning. This work could include conducting content analyses of students’ responses on the booster assignments to identify the nuance of students’ study strategies and alignment with intervention content.

    Workshop Content and Students’ Course Achievement

    Our results indicate that students who participated in the Metacognition+TM workshop tended to earn higher scores on each of the two subsequent course exams. Accounting for all covariates, including the exam 1 score, the SEM indicated that students who completed the Metacognition+TM workshop would have exam 2 and exam 3 scores that were 2.9 and 4.2 percentage points higher, respectively, than students who had completed the Metacognition workshop. Although perhaps not an overwhelming advantage, these findings do suggest that the supplemental curriculum covered by the Metacognition+TM workshop provided an additional benefit to students’ later learning of course material. These changes can be interpreted in a few different ways.
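    The percentage-point figures can be recovered from the partially standardized estimates by multiplying by each exam's score SD (the inverse of the STDY scaling). The exam SDs below (about 13.2 and 15.0 points) are back-calculated from the reported numbers rather than taken from the article, so this is an illustrative reconstruction only:

```python
def sd_units_to_points(effect_sd, exam_sd):
    """Convert a treatment effect in outcome-SD units back to raw points."""
    return effect_sd * exam_sd

# Back-calculated exam SDs implied by the reported effects:
exam2_points = sd_units_to_points(0.22, 13.2)  # ~2.9 percentage points
exam3_points = sd_units_to_points(0.28, 15.0)  # ~4.2 percentage points
```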

    On the one hand, the group differences in exam performance are consistent with studies linking increased time management and decreased procrastination with improved academic functioning (Landrum et al., 2006; Steel, 2007; Credé and Kuncel, 2008; Credé and Phillips, 2011; Krumrei-Mancuso et al., 2013; Basila, 2014). From this perspective, the Metacognition+TM curriculum may have served to improve students’ time management and reduce procrastination in a way that increased their learning of course material, studying, and subsequent exam performance. On the other hand, we did not find clear evidence that the Metacognition+TM students’ use of time management strategies increased or that their levels of procrastination decreased any more or less than for the Metacognition students.

    Still, there are some possibilities to explore before dismissing improvements in skills as a possible causal pathway for improved exam performance. First, it could be that the Metacognition+TM workshop improved students’ engagement, time management, and procrastination in ways not directly assessed with items on the survey. For example, students may have used time more productively over the course of a given day or become more accurate in estimating the amount of time needed to prepare for an exam. Future studies that involve behavioral indicators—such as time-tracking diaries (Nonis et al., 2006)—could provide further insight into how students use their time before and after an intervention. Second, it is possible that the workshop provided benefits to students that manifested in better learning and exam performance in the intervening weeks but, in terms of measurably different behaviors, diminished by the time the second survey was administered. Additional research is needed to confirm the impact of the Metacognition+TM workshop on later academic performance and to uncover the most viable reason for the apparent positive association with later achievement.

    A number of prior studies have demonstrated the value of teaching students how to learn (Cook et al., 2013; Zhao et al., 2014; Stanton et al., 2015; van Vliet et al., 2015; Dye and Stanton, 2017; Sabel et al., 2017; Sebesta and Speth, 2017). The improvements in exam scores for students who participated in the Metacognition+TM workshop provide additional evidence for the value of teaching biology students how to take a self-aware and planful approach to learning. Our study suggests a specific way to implement this: providing workshop content during lab time soon after an exam, in partnership with other university resources and with the support of booster assignments. Instructors are experts in their subject matter but may not always have specialized knowledge in topics such as managing a college schedule or addressing the psychological roots of procrastination. Although we recommend that instructors deepen their knowledge of these topics, it can be valuable to work with the university learning center as a partner in student success (McGuire, 2015). This approach can be beneficial, because “leveraging the assets of competent staff from learning centers to help faculty members integrate these best practices conserves scarce institutional resources” (Arendale, 2010, p. 12).

    There are other aspects of the design that may require further research before the direct implications for instruction are known. In our study, the instructor presented content about metacognition, and the learning center representative presented content about time management. Having the course instructor present workshop content was likely effective, because it leveraged an existing relationship (Pianta et al., 2012), but bringing in an outside expert may have had a different benefit: piquing students’ interest or curiosity (Rotgans and Schmidt, 2014). As positive academic emotions are known to facilitate learning (Tanaka and Murayama, 2014), a change in instructor may modify the impact of academic workshops based on student perceptions of trust and preferences for novelty. Although these potential impacts on student engagement were not measured in this study, it is worthwhile to consider them in future intervention work. Another aspect that was not accounted for directly was the differing length of the two interventions. We designed two interventions that would be authentic to the classroom context and could be implemented as is by other educators, as our purpose was to investigate the relative benefits of teaching about metacognition alone or teaching about metacognition plus time management. Adding an additional hour of metacognition content to equalize time on task would not reflect how the intervention would be used outside the study. As a consequence of this design, however, we are unable to definitively eliminate time on task (i.e., time spent learning about learning) as a competing explanation for students’ outcomes. As instructors implement these efforts, we recommend ongoing research and assessment to better understand which design elements best support students’ success and complement the curriculum.

    Workshop Content and Outcomes for Students from Minoritized Groups

    Both minoritized and majority status students in science can experience struggles adjusting to the rigor and time demands of college-level learning (Yazedjian et al., 2008). As a systemic issue, however, students from minoritized groups are often disproportionately disadvantaged due to facing unjustified lowered expectations or having less access to early educational resources and college preparatory strategies (Chang et al., 2014; Tsui, 2016). In light of the shared interest among STEM educators and researchers in supporting minoritized students’ equitable access to resources and opportunities, we made a particular effort to understand whether Black/African-American and Hispanic/Latinx students responded to the two workshops in ways that were similar to their White or Asian classmates. In most respects, our findings indicate that minoritized and majority students were impacted by the workshops in quite similar ways. However, one main effect and one interaction effect suggest that participating in workshops about effective learning may be a valuable form of academic support for students from minoritized groups.

    When accounting for the rigorous set of covariates in our main analysis, there was one main effect associated with students’ minoritized group status: Across both workshop groups, students from minoritized groups had higher self-reported degree commitment at the end of the semester than their majority status classmates. This finding demonstrates the malleability of degree commitment. Scholars (e.g., Lewis, 2003; Harper, 2010) rightly criticize the notion that STEM achievement gaps or college-to-career pipeline leaks are due to deficits in minoritized students’ aptitude or attitudes. As scholars such as Yosso (2005) and Denton et al. (2020) address in their work on community cultural wealth, our study supports the idea that minoritized students’ commitment to earning a college degree can even exceed that of majority students and may be an important motivational asset that educators can further encourage. These commitments reflect aspirational capital, a form of resilience seen in students’ “ability to maintain hopes and dreams for the future, even in the face of real and perceived barriers” (Yosso, 2005, p. 77). Although additional research is needed to explore the connection, it seems likely that exposure to academic strategy workshops as part of their course experience may support the already strong academic commitments held by students from minoritized groups. Because we did not compare students who experienced either workshop with students who experienced no workshop, the changes in degree commitment for students from minoritized backgrounds cannot be attributed to the workshop experience alone. 
It is also possible that the growth in degree commitment for students from minoritized groups stemmed from a different aspect of their experience during the semester, such as peer mentoring or undergraduate research, which are known to support retention in STEM (Wilson et al., 2012), or networks in the community and at home (i.e., social capital) that encourage students to persevere toward a degree (Yosso, 2005).

    The interaction effect identified in the study also suggests the additional benefits of learning about managing time and overcoming procrastination. As shown in Figure 1, the impact of the workshops on the self-reported use of time management tools was different for minoritized and majority students. Students from minoritized groups who completed the Metacognition+TM workshop tended to report increased use of time management tools to a greater extent than students from minoritized groups in the Metacognition workshop. Thus, particularly for students who have historically been underrepresented in or excluded from STEM educational opportunities (Asai, 2020a,b), the additional workshop content seems to have an important impact on self-reported behaviors measured by the time management tools scale, such as creating reminders for oneself, making lists of daily tasks, and keeping track of deadlines using a calendar. To better understand the nature of changes in students’ time management tools beyond the content and limited time points of the self-report scales, it would be valuable to examine students’ booster assignments using qualitative or mixed-methods inquiry with rigorous coding processes. In the present study, however, the booster assignments were introduced following students’ participation in the workshops, and it was not possible to draw comparisons in booster assignment content before and after the intervention. We recommend future research designs that prioritize examining the details of students’ written plans and reflections to better understand ongoing changes in how students think about and implement various academic strategies.

    Prior literature suggests at least two possible reasons why the additional time management material may have been particularly impactful for students from minoritized groups. However, it is important to consider the following explanations as only speculative and not based on any specific characteristics observed among the students in our study. To honor student diversity, educators and researchers must consider concepts such as self-regulation in the broader context of students’ social, cultural, and historical perspectives and experiences, a process known as race-reimaging (Helms et al., 2005; Matthews and López, 2020). Beyond the initial exploration below, additional research is necessary to work against stereotypes and portray a fuller picture of how students from minoritized groups engage with academic strategies and contexts.

    One reason the time management material could have had an impact is that students may not previously have had sufficient opportunity to learn or master these skills. This explanation might make sense if students were more likely to have been systematically excluded from earlier schooling environments that taught these strategies (Kendi, 2019). In a number of U.S. states, for example, schools with a high proportion of Black students face resource disparities relative to schools with a high proportion of White students, leading to disparities in learning opportunities (Kendi, 2019). Even in schools that offer a college preparatory curriculum, students from minoritized backgrounds may be discouraged from pursuing these opportunities (Murphy and Zirkel, 2015). Thus, it is possible that the minoritized students in our study were not previously encouraged to learn about time management as a pathway to greater learning. In Yosso’s community cultural wealth framework, resistant capital includes the “knowledge and skills fostered through oppositional behavior that challenges inequality” (Yosso, 2005, p. 13). From this perspective, students’ actions to apply workshop content could be seen as a reflection of resistant capital in which they challenged misconceptions about what they could accomplish as students. Specifically, students may have enacted resistant capital by asserting themselves as worthy and capable students (Yosso, 2005), particularly after having experienced systems that may have limited opportunities or provided devaluing messages about their academic capabilities (Kendi, 2019).

    A second reason the time management material may have been particularly impactful would be if students experienced high demands or pressure on their time. For instance, if students were more likely to be employed, have family-based time demands, or be taking more courses, then learning about new effective time management strategies might have a greater positive impact on their academic functioning. Prior research suggests that students from minoritized groups tend to work full- or part-time jobs while in college to a greater extent than White or Asian students (Hurtado et al., 2010). In addition to the time demands of employment, college students from minoritized groups tend to devote more hours to family obligations than majority status students (Tovar and Simon, 2006). Students who experience these situations and related time pressures may be especially responsive to strategies that help them to bring added efficiency and structure to their schedules. As we did not collect information about participants’ time demands and the number of students from minoritized backgrounds was small, we cannot say with certainty whether these conditions explain the additional impact of the time management content.

    Additional research is needed to uncover which of these explanations (or others) might best explain why students from minoritized groups changed their time management strategies as a result of the Metacognition+TM workshop. In particular, researchers could evaluate the extent to which minoritized students in STEM face greater demands on their time, where these increased demands originate, and which specific time management strategies are most valuable in this context. Because minority status has historically been created due to exclusion, it is important for both research efforts and related instructional practices to approach time management from the perspective of creating equity and fostering inclusion (Asai, 2020a,b). Inclusive pedagogy concerns not only the subject matter but also the environment and strategies for learning (Kenyon, 2011). When working to support student success, it is essential that instructors convey the belief that all students are capable of learning and use classroom pedagogy to demonstrate an investment in students’ growth and ability to meet high expectations (Ramirez, 2020). Classroom workshops like those investigated in the present study provide one such pathway forward. Accompanying these efforts, future studies must investigate the cultural nuances of students’ academic development in the biology classroom to avoid oversimplifying or making assumptions based on racial or ethnic background (Matthews and López, 2020).

    Limitations

    In addition to those limitations already mentioned, findings from the present study should be considered within the context of a few additional limitations. All survey items focused on students’ general academic beliefs and strategies; the instructions did not require students to think only of biology class when responding. Because both interventions were embedded in the biology context and provided specific examples of the ways strategies could be applied to biology, the interventions might have had a greater impact on students’ biology beliefs and strategies than was assessed by the survey items. Moreover, details about students’ beliefs and strategies were subject to the limitations of self-report data. Due to social desirability bias, students may have chosen responses to present themselves in a positive light, rather than being self-critical (Bowman and Hill, 2011). The self-report data also may not have captured the full range and complexity of students’ actual beliefs and strategies (Perry, 2002). Future research should aim to provide a more fine-grained assessment of discipline-specific learning through methods that extend beyond traditional surveys.

    It is important to recognize the possibility of multiple explanations for the academic changes reported by students from minoritized backgrounds, particularly in terms of time management and degree commitment. The number of total participants from minoritized backgrounds was small (n = 30), there was not an additional comparison group of students who did not participate in either workshop, and the analyses used students’ racial and ethnic backgrounds as a categorical variable. Given these constraints, the investigation of race and ethnicity was unable to account for the diversity of experiences and perspectives salient to students from minoritized backgrounds (Matthews and López, 2020). This limitation underscores the need for further research that assesses the cultural, social, and historical aspects of motivation and learning (Schutz, 2020).

    CONCLUSION

    This research suggests that implementing a workshop intervention within the bounds of an academic course can be a useful practice, particularly in college courses that can be challenging for students. Addressing metacognitive strategies as well as other types of self-regulated learning strategies, such as time management, may benefit students. Gaining a better understanding of the specific types of interventions that are most effective, as well as how to enact them as part of inclusive pedagogy, is essential. Future research that considers novel approaches to design and measurement can provide additional clarity regarding how students’ beliefs, strategies, and achievement may develop as a result of learning the how and when of effective studying.

    ACKNOWLEDGMENTS

    This project was supported in part by funding from the Student Academic Success Research Grants Program and the Drake Institute for Teaching and Learning Research and Implementation Grants Program at the Ohio State University.

    REFERENCES

  • ACT. (2018). Guide to the 2018 ACT/SAT concordance. Retrieved August 15, 2020, from https://collegereadiness.collegeboard.org/pdf/guide-2018-act-sat-concordance.pdf Google Scholar
  • Aflalo, E. (2018). Students generating questions as a way of learning. Active Learning in Higher Education, 15(2), 157–171. https://doi.org/10.1177/1469787418769120 Google Scholar
  • Allen, J., Robbins, S. B., Casillas, A., & Oh, I.-S. (2008). Third-year college retention and transfer: Effects of academic performance, motivation, and social connectedness. Research in Higher Education, 49(7), 647–664. https://doi.org/10.1007/s11162-008-9098-3 Google Scholar
  • Arendale, D. R. (2010). Current challenges and controversies for learning assistance. ASHE Higher Education Report, 35(6), 1–145. https://doi.org/10.1002/aehe.3506 Google Scholar
  • Asai, D. J. (2020a). Excluded. Journal of Microbiology and Biology Education, 21(2), 1–2. Google Scholar
  • Asai, D. J. (2020b). Race matters. Cell, 181(4), 754–757. https://doi.org/10.1016/j.cell.2020.03.044 MedlineGoogle Scholar
  • Bandura, A. (2006). Guide for constructing self-efficacy scales. In Pajares, F.Urdan, T. (Eds.), Self-Efficacy Beliefs in Adolescents (pp. 307–337). Greenwich, CT: Information Age Publishing. Google Scholar
  • Basila, C. (2014). Good time management and motivation level predict student academic success in college on-line courses. International Journal of Cyber Behavior, Psychology and Learning, 4(3), 45–52. https://doi.org/10.4018/ijcbpl.2014070104 Google Scholar
  • Bembenutty, H. (2011). Academic delay of gratification and academic achievement. New Directions for Teaching and Learning, 126, 55–65. https://doi.org/10.1002/tl Google Scholar
  • Benjamini, Y., & Hochberg, Y. (1995). Controlling the false discovery rate: A practical and powerful approach to multiple testing. Journal of the Royal Statistical Society, 57(1), 289–300. Google Scholar
  • Bernacki, M. L., Vosicka, L., & Utz, J. C. (2020). Can a brief, digital skill training intervention help undergraduates “learn to learn” and improve their STEM achievement? Journal of Educational Psychology, 112(4), 765–781.  https://doi.org/10.1037/edu0000405.supp Google Scholar
  • Bernacki, M. L., Vosicka, L., Utz, J. C., & Warren, C. B. (2021). Effects of digital learning skill training on the academic performance of undergraduates in science and mathematics. Journal of Educational Psychology, 113, 1107–1125. https://doi.org/10.1037/edu0000485 Google Scholar
  • Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy of educational objectives: Handbook I, Cognitive domain. New York, NY: David McKay. Google Scholar
  • Bodill, K., & Roberts, L. D. (2013). Implicit theories of intelligence and academic locus of control as predictors of studying behaviour. Learning and Individual Differences, 27, 163–166. https://doi.org/10.1016/j.lindif.2013.08.001 Google Scholar
  • Boretz, E. (2012). Midsemester academic interventions in a student-centered research university. Journal of College Reading and Learning, 42(2), 90–108. https://doi.org/10.1080/10790195.2012.10850356 Google Scholar
  • Bowman, N. A., & Hill, P. L. (2011). Measuring how college affects students: Social desirability and other potential biases in college student self-reported gains. New Directions for Institutional Research, 150, 73–85. https://doi.org/10.1002/ir Google Scholar
  • Broadbent, J., & Poon, W. L. (2015). Self-regulated learning strategies & academic achievement in online higher education learning environments: A systematic review. Internet and Higher Education, 27, 1–13. https://doi.org/10.1016/j.iheduc.2015.04.007 Google Scholar
  • Brooks-Harris, J., & Stock-Ward, S. (1999). Workshops: Designing and facilitating experiential learning. Thousand Oaks, CA: Sage. Google Scholar
  • Byrne, B. M. (2006). Structural equation modeling with EQS: Basic concepts, applications, and programming (2nd ed.). Mahwah, NJ: Erlbaum. Google Scholar
  • Canning, E. A., Muenks, K., Green, D. J., & Murphy, M. C. (2019). STEM faculty who believe ability is fixed have larger racial achievement gaps and inspire less student motivation in their classes. Science Advances, 5(2), eaau4734. https://doi.org/10.1126/sciadv.aau4734 MedlineGoogle Scholar
  • Chang, M. J., Sharkness, J., Hurtado, S., & Newman, C. B. (2014). What matters in college for retaining aspiring scientists and engineers from underrepresented racial groups. Journal of Research in Science Teaching, 51(5), 555–580. https://doi.org/10.1002/tea.21146 Google Scholar
  • Cholewa, B., & Ramaswami, S. (2015). The effects of counseling on the retention and academic performance of underprepared freshmen. Journal of College Student Retention, 17, 204–225. https://doi.org/10.1177/1521025115578233 Google Scholar
  • Claessens, B. J. C., van Eerde, W., Rutte, C. G., & Roe, R. A. (2007). A review of the time management literature. Personnel Review, 36(2), 255–276. https://doi.org/10.1108/00483480710726136 Google Scholar
  • Cogliano, M., Bernacki, M. L., & Kardash, C. M. (2020). A metacognitive retrieval practice intervention to improve undergraduates’ monitoring and control processes and use of performance feedback for classroom learning. Journal of Educational Psychology. Advance online publication. https://doi.org/10.1037/edu0000624 Google Scholar
  • Collins, W., & Sims, B. C. (2006). Help seeking in higher education: Academic support services. In Karabenick, S. A.Newman, R. S. (Eds.), Help seeking in academic settings: Goals, groups, and contexts (pp. 203–223). Mahwah, NJ: Erlbaum. Google Scholar
  • Cook, E., Kennedy, E., & McGuire, S. Y. (2013). Effect of teaching metacognitive learning strategies on performance in general chemistry courses. Journal of Chemical Education, 90(8), 961–967. https://doi.org/10.1021/ed300686h Google Scholar
  • Credé, M., & Kuncel, N. R. (2008). Study habits, skills, and attitudes: The third pillar supporting collegiate academic performance. Perspectives on Psychological Science, 3(6), 425–453. https://doi.org/10.1111/j.1745-6924.2008.00089.x MedlineGoogle Scholar
  • Credé, M., & Phillips, L. A. (2011). A meta-analytic review of the Motivated Strategies for Learning Questionnaire. Learning and Individual Differences, 21(4), 337–346. https://doi.org/10.1016/j.lindif.2011.03.002 Google Scholar
  • Davidson, W. B., Beck, H. P., & Milligan, M. (2009). The College Persistence Questionnaire: Development and validation of an instrument that predicts student attrition. Journal of College Student Development, 50(4), 373–390. https://doi.org/10.1353/csd.0.0079 Google Scholar
  • Denton, M., Borrego, M., & Boklage, A. (2020). Community cultural wealth in science, technology, engineering, and mathematics education: A systematic review. Journal of Engineering Education, 109(3), 556–580. https://doi.org/10.1002/jee.20322 Google Scholar
  • Dewitte, S., & Schouwenburg, H. C. (2002). Procrastination, temptations, and incentives: The struggle between the present and the future in procrastinators and the punctual. European Journal of Personality, 16(6), 469–489. https://doi.org/10.1002/per.461 Google Scholar
  • Dewitz, S. J., Woolsey, M. L., & Walsh, W. B. (2009). College student retention: An exploration of the relationship between self-efficacy beliefs and purpose in life among college students. Journal of College Student Development, 50(1), 19–34. Google Scholar
  • DiBenedetto, M. K., & Bembenutty, H. (2013). Within the pipeline: Self-regulated learning, self-efficacy, and socialization among college students in science courses. Learning and Individual Differences, 23(1), 218–224. https://doi.org/10.1016/j.lindif.2012.09.015 Google Scholar
  • Dimmitt, C., & McCormick, C. B. (2012). Metacognition in education. In Harris, K. R.Graham, S.Urdan, T. C. (Eds.), APA educational psychology handbook, Vol 1, Theories, constructs, and critical issues (pp. 157–187). Washington, DC: American Psychological Association. https://doi.org/10.4324/9780203805503.ch22 Google Scholar
  • Donker, A. S., de Boer, H., Kostons, D., Dignath van Ewijk, C. C., & van der Werf, M. P. C. (2014). Effectiveness of learning strategy instruction on academic performance: A meta-analysis. Educational Research Review, 11, 1–26. https://doi.org/10.1016/j.edurev.2013.11.002 Google Scholar
  • Dweck, C. S. (1999). Self-theories: Their role in motivation, personality, and development. Philadelphia, PA: Taylor & Francis. Google Scholar
  • Dye, K. M., & Stanton, J. D. (2017). Metacognition in upper-division biology students: Awareness does not always lead to control. CBE—Life Sciences Education, 16(2), 1–14. https://doi.org/10.1187/cbe.16-09-0286 Google Scholar
  • Estrada, M., Burnett, M., Campbell, A. G., Campbell, P. B., Denetclaw, W. F., Gutiérrez, C. G., ... & Zavala, M. E. (2016). Improving underrepresented minority student persistence in STEM. CBE—Life Sciences Education, 15(3), 1–10. https://doi.org/10.1187/cbe.16-01-0038 Google Scholar
  • Findley-Van Nostrand, D., & Pollenz, R. S. (2017). Evaluating psychosocial mechanisms underlying STEM persistence in undergraduates: Evidence of impact from a six-day pre-college engagement STEM academy program. CBE—Life Sciences Education, 16(2), 1–15. https://doi.org/10.1187/cbe.16-10-0294 Google Scholar
  • Flavell, J. H. (1979). Metacognition and cognitive monitoring: A new area of cognitive-developmental inquiry. American Psychologist, 34(10), 906–911. https://doi.org/10.1037/0003-066X.34.10.906 Google Scholar
  • Freeman, S., Haak, D., & Wenderoth, M. P. (2011). Increased course structure improves performance in introductory biology. CBE—Life Sciences Education, 10(2), 175–186. https://doi.org/10.1187/cbe.10-08-0105 LinkGoogle Scholar
  • Freeman, S., O’Connor, E., Parks, J. W., Cunningham, M., Hurley, D., Haak, D., ... & Wenderoth, M. P. (2007). Prescribed active learning increases performance in introductory biology. CBE—Life Sciences Education, 6(2), 132–139. https://doi.org/10.1187/cbe.06-09-0194 LinkGoogle Scholar
  • Gerhardt, M. W., & Brown, K. G. (2006). Individual differences in self-efficacy development: The effects of goal orientation and affectivity. Learning and Individual Differences, 16(1), 43–59. https://doi.org/10.1016/j.lindif.2005.06.006 Google Scholar
  • Häfner, A., Oberst, V., & Stock, A. (2014). Avoiding procrastination through time management: An experimental intervention study. Educational Studies, 40(3), 352–360. https://doi.org/10.1080/03055698.2014.899487 Google Scholar
  • Häfner, A., Stock, A., & Oberst, V. (2015). Decreasing students’ stress through time management training: An intervention study. European Journal of Psychology of Education, 30, 81–94. https://doi.org/10.1007/s10212-014-0229-2 Google Scholar
  • Harper, S. R. (2010). An anti-deficit achievement framework for research on students of color in STEM. New Directions for Institutional Research, 148, 63–74. https://doi.org/10.1002/ir.362 Google Scholar
  • Harrington, N. (2005). It’s too difficult! Frustration intolerance beliefs and procrastination. Personality and Individual Differences, 39(5), 873–883. https://doi.org/10.1016/j.paid.2004.12.018 Google Scholar
  • Hartwig, M. K., & Dunlosky, J. (2012). Study strategies of college students: Are self-testing and scheduling related to achievement? Psychonomic Bulletin & Review, 19(1), 126–134. https://doi.org/10.3758/s13423-011-0181-y MedlineGoogle Scholar
  • Hattie, J., Biggs, J., & Purdie, N. (1996). Effects of learning skills interventions on student learning: A meta-analysis. Review of Educational Research, 66(2), 99–136. Google Scholar
  • Hattie, J., & Donoghue, G. M. (2016). Learning strategies: A synthesis and conceptual model. Science of Learning, 1, 1–13. https://doi.org/10.1038/npjscilearn.2016.13 Google Scholar
  • Helms, J. E., Jernigan, M., & Mascher, J. (2005). The meaning of race in psychology and how to change it: A methodological perspective. American Psychologist, 60(1), 27–36. https://doi.org/10.1037/0003-066X.60.1.27 MedlineGoogle Scholar
  • Hensley, L. C. (2014). Reconsidering active procrastination: Relations to motivation and achievement in college anatomy. Learning and Individual Differences, 36, 157–164. https://doi.org/10.1016/j.lindif.2014.10.012 Google Scholar
  • Hensley, L. C., & Munn, K. (2020). The power of writing about procrastination: Journaling as a tool for change. Journal of Further and Higher Education, 44(10), 1450–1465. https://doi.org/10.1080/0309877X.2019.1702154 Google Scholar
  • Hensley, L. C., Shaulskiy, S., Zircher, A., & Sanders, M. (2015). Overcoming barriers to engaging in college academics. Journal of Student Affairs Research and Practice, 52(2), 176–189. https://doi.org/10.1080/19496591.2015.1020246 Google Scholar
  • Hoffmann, R., & McGuire, S. Y. (2010). Learning and teaching strategies. American Scientist, 98(5), 378–382. https://doi.org/10.4324/9781315171944-6 Google Scholar
  • Hoops, L. D., & Kutrybala, L. (2015). The impact of a summer bridge program on nontraditional student development: Teacher care matters. Community College Journal of Research and Practice, 39(11), 1039–1051. https://doi.org/10.1080/10668926.2014.933456 Google Scholar
  • Hopkins, R. F., Lyle, K. B., Hieb, J. L., & Ralston, P. A. S. (2016). Spaced retrieval practice increases college students’ short- and long-term retention of mathematics knowledge. Educational Psychology Review, 28(4), 853–873. https://doi.org/10.1007/s10648-015-9349-8 Google Scholar
  • Horn, J. L., & McArdle, J. J. (1992). A practical and theoretical guide to measurement invariance in aging research. Experimental Aging Research, 18(3–4), 117–144. https://doi.org/10.1080/03610739208253916 MedlineGoogle Scholar
  • Howell, A. J., & Buro, K. (2009). Implicit beliefs, achievement goals, and procrastination: A mediational analysis. Learning and Individual Differences, 19(1), 151–154. https://doi.org/10.1016/j.lindif.2008.08.006 Google Scholar
  • Hu, L.-T., & Bentler, P. M. (1999). Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Structural Equation Modeling, 6(1), 1–55. https://doi.org/10.1080/10705519909540118 Google Scholar
  • Hurtado, S., Newman, C. B., Tran, M. C., & Chang, M. J. (2010). Improving the rate of success for underrepresented racial minorities in STEM fields: Insights from a national project. New Directions for Institutional Research, 148, 5–15. https://doi.org/10.1002/ir.357 Google Scholar
  • Jackson, T., Weiss, K. E., Lundquist, J. J., & Hooper, D. (2003). The impact of hope, procrastination, and social activity on academic performance of midwestern college students. Education, 124(2), 310–320. Google Scholar
  • Jordt, H., Eddy, S. L., Brazil, R., Lau, I., Mann, C., Brownell, S. E., ... & Freeman, S. (2017). Values affirmation intervention reduces achievement gap between underrepresented minority and white students in introductory biology classes. CBE—Life Sciences Education, 16(3), 1–10. https://doi.org/10.1187/cbe.16-12-0351 Google Scholar
  • Kelley, K., & Preacher, K. J. (2012). On effect size. Psychological Methods, 17(2), 137–152. https://doi.org/10.1037/a0028086 MedlineGoogle Scholar
  • Kendi, I. X. (2019). How to be an antiracist. New York: Penguin Random House. Google Scholar
  • Kenyon, M. A. (2011). Employing universal design for instruction. New Directions for Student Services, 134, 21–33. https://doi.org/10.1002/ss.392 Google Scholar
  • Kim, Y., Brady, A. C., & Wolters, C. A. (2018). Development and validation of the brief regulation of motivation scale. Learning and Individual Differences, 67, 259–265. https://doi.org/10.1016/j.lindif.2017.12.010 Google Scholar
  • Kim, Y., Brady, A. C., & Wolters, C. A. (2020). College students’ regulation of cognition, motivation, behavior, and context: Distinct or overlapping processes? Learning and Individual Differences, 80, 1–8. https://doi.org/10.1016/j.lindif.2020.101872 Google Scholar
  • Klassen, R. M., Krawchuk, L. L., & Rajani, S. (2008). Academic procrastination of undergraduates: Low self-efficacy to self-regulate predicts higher levels of procrastination. Contemporary Educational Psychology, 33(4), 915–931. https://doi.org/10.1016/j.cedpsych.2007.07.001 Google Scholar
  • Klingsieck, K. B., Grund, A., Schmid, S., & Fries, S. (2013). Why students procrastinate: A qualitative approach. Journal of College Student Development, 54, 397–412. https://doi.org/10.1353/csd.2013.0060 Google Scholar
  • Knekta, E., Runyon, C., & Eddy, S. (2019). One size doesn’t fit all: Using factor analysis to gather validity evidence when using surveys in your research. CBE—Life Sciences Education, 18(1), 1–17. https://doi.org/10.1187/cbe.18-04-0064 Google Scholar
  • Krumrei-Mancuso, E. J., Newton, F. B., Kim, E., & Wilcox, D. (2013). Psychosocial factors predicting first-year college student success. Journal of College Student Development, 54(3), 247–266. https://doi.org/10.1353/csd.2013.0034 Google Scholar
  • Kuh, G. D., Kinzie, J., Buckley, J. A., Bridges, B. K., & Hayek, J. C. (2006). What matters to student success: A review of the literature (Commissioned Report for the National Symposium on Postsecondary Student Success: Spearheading a Dialog on Student Success). Retrieved September 15, 2020, from https://nces.ed.gov/npec/pdf/kuh_team_report.pdf Google Scholar
  • Kuh, G. D., Kinzie, J., Buckley, J. A., Bridges, B. K., & Hayek, J. C. (2007). Piecing together the student success puzzle: Research, propositions, and recommendations. ASHE Higher Education Report, 32(5), 1–182. https://doi.org/10.1002/aehe.3205 Google Scholar
  • Landrum, R. E., Turrisi, R., & Brandel, J. M. (2006). College students’ study time: Course level, time of semester, and grade earned. Psychological Reports, 98(3), 675–682. MedlineGoogle Scholar
  • Larmar, S., & Lodge, J. M. (2014). Making sense of how I learn: Metacognitive capital and the first year university student. International Journal of the First Year in Higher Education, 5(1), 93–105. https://doi.org/10.5204/intjfyhe.v5i1.193 Google Scholar
  • Lazowski, R. A., & Hulleman, C. S. (2016). Motivation interventions in education: A meta-analytic review. Review of Educational Research, 86(2), 602–640. https://doi.org/10.3102/0034654315617832 Google Scholar
  • Lent, R. W., Sheu, H.-B., Miller, M. J., Cusick, M. E., Penn, L. T., & Truong, N. N. (2018). Predictors of science, technology, engineering, and mathematics choice options: A meta-analytic path analysis of the social-cognitive choice model by gender and race/ethnicity. Journal of Counseling Psychology, 65(1), 17–35. https://doi.org/10.1037/cou0000243 MedlineGoogle Scholar
  • Lewis, B. F. (2003). A critique of literature on the underrepresentation of African Americans in science: Directions for future research. Journal of Women and Minorities in Science and Engineering, 9(3–4), 361–373. https://doi.org/10.1615/jwomenminorscieneng.v9.i34.100 Google Scholar
  • Lou, N. M., Masuda, T., & Li, L. M. W. (2017). Decremental mindsets and prevention-focused motivation: An extended framework of implicit theories of intelligence. Learning and Individual Differences, 59, 96–106. https://doi.org/10.1016/j.lindif.2017.08.007 Google Scholar
  • Louis, K. S., Anderson, M. S., & Rosenberg, L. (2011). Interactions of metacognition with motivation and affect in self-regulated learning: The MASRL model. Educational Psychologist, 46(1), 6–25. Google Scholar
  • Macan, T. (1994). Time management: Test of a process model. Journal of Applied Psychology, 79, 381–391. https://doi.org/10.1037/0021-9010.79.3.381 Google Scholar
  • Matthews, J. S., & López, F. (2020). Race-reimaging educational psychology research: Investigating constructs through the lens of race and culture. Contemporary Educational Psychology, 61, 1–7, Advance online publication. https://doi.org/10.1016/j.cedpsych.2020.101878 Google Scholar
  • May, G. S., & Chubin, D. E. (2003). A retrospective on undergraduate engineering success for underrepresented and first-year students. Journal of Engineering Education, 92, 27–39. https://doi.org/10.18260/1-2-31987 Google Scholar
  • McCarthy, M., & Kuh, G. D. (2006). Are students ready for college? What student engagement data say. Phi Delta Kappan, 87, 664–669. Google Scholar
  • McDaniel, M. A., & Einstein, G. O. (2020). Training learning strategies to promote self-regulation and transfer: The knowledge, belief, commitment, and planning framework. Perspectives on Psychological Science, 15(6), 1363–1381. https://doi.org/10.1177/1745691620920723 MedlineGoogle Scholar
  • McGuire, S. Y. (2015). Teach students how to learn: Strategies you can incorporate into any course to improve student metacognition, study skills, and motivation. Sterling, VA: Stylus. Google Scholar
  • Moshman, D. (2018). Metacognitive theories revisited. Educational Psychology Review, 30(2), 599–606. https://doi.org/10.1007/s10648-017-9413-7 Google Scholar
  • Murphy, M. C., & Zirkel, S. (2015). Race and belonging in school: How anticipated and experienced belonging affect choice, persistence, and performance. Teachers College Record, 117(12), 1–40. Google Scholar
  • Museus, S. D., Palmer, R. T., Davis, R. J., & Maramba, D. C. (2011). Racial and ethnic minority students’ success in STEM education. ASHE Higher Education Report, 36(6), 1–140. Google Scholar
  • National Research Council. (2000). How people learn: Brain, mind, experience, and school. Washington, DC: National Academies Press. Google Scholar
  • National Science Foundation. (2019). Women, minorities, and persons with disabilities in science and engineering (Special report NSF 19-304). Retrieved July 8, 2021, from https://ncses.nsf.gov/pubs/nsf19304/digest Google Scholar
  • Netemeyer, R. G., Bearden, W. O., & Sharma, S. (2003). Scaling procedures: Issues and applications. Thousand Oaks, CA: Sage. Google Scholar
  • Niemiec, C. P., & Ryan, R. M. (2009). Autonomy, competence, and relatedness in the classroom: Applying self-determination theory to educational practice. Theory and Research in Education, 7, 133–144. https://doi.org/10.1177/1477878509104318 Google Scholar
  • Nonis, S. A., Philhours, M. J., & Hudson, G. I. (2006). Where does the time go? A diary approach to business and marketing students’ time use. Journal of Marketing Education, 28(2), 121–134. https://doi.org/10.1177/0273475306288400 Google Scholar
  • Nora, A. (2004). The role of habitus and cultural capital in choosing a college, transitioning from high school to higher education, and persisting in college among minority and nonminority students. Journal of Hispanic Higher Education, 3(2), 180–208. https://doi.org/10.1177/1538192704263189 Google Scholar
  • Nordell, S. E. (2009). Learning how to learn: A model for teaching students learning strategies. Bioscene, 35(1), 35–42. Google Scholar
  • Oakley, B. (2014). A mind for numbers. New York, NY: Penguin Group. Google Scholar
  • Panadero, E. (2017). A review of self-regulated learning: Six models and four directions for research. Frontiers in Psychology, 8, 1–28. https://doi.org/10.3389/fpsyg.2017.00422 MedlineGoogle Scholar
  • Perin, D., & Holschuh, J. P. (2019). Teaching academically underprepared postsecondary students. Review of Research in Education, 43, 363–393. https://doi.org/10.3102/0091732X18821114 Google Scholar
  • Perry, N. E. (2002). Using qualitative methods to enrich understandings of self-regulated learning. Educational Psychologist, 37(1), 1–3. https://doi.org/10.1207/00461520252828500 Google Scholar
  • Pianta, R. C., Hamre, B. K., & Allen, J. P. (2012). Teacher-student relationships and engagement: Conceptualizing, measuring, and improving the capacity of classroom interactions. In Christenson, S. L.Reschly, A. L.Wylie, C. (Eds.), Handbook of research on student engagement (pp. 365–386). New York, NY: Springer. Google Scholar
  • Pintrich, P. R. (2002). The role of metacognitive knowledge in learning, teaching, and assessing. Theory into Practice, 41(4), 219–225. https://doi.org/10.1207/s15430421tip4104_3 Google Scholar
  • Pintrich, P. R., Smith, D. A. F., Garcia, T., & McKeachie, W. J. (1993). Reliability and predictive validity of the Motivated Strategies for Learning Questionnaire (MSLQ). Educational and Psychological Measurement, 53, 801–813. Google Scholar
  • Pintrich, P. R., & Zusho, A. (2007). Student motivation and self-regulated learning in the college classroom. In Perry, R. P.Smart, J. C. (Eds.), The scholarship of teaching and learning in higher education: An evidence-based practice (pp. 731–810). New York, NY: Springer. Google Scholar
  • Pittman, L. D., & Richmond, A. (2008). University belonging, friendship quality, and psychological adjustment during the transition to college. Journal of Experimental Education, 76(4), 343–362. https://doi.org/10.3200/JEXE.76.4.343-362 Google Scholar
  • Putnick, D. L., & Bornstein, M. H. (2016). Measurement invariance conventions and reporting: The state of the art and future directions for psychological research. Developmental Review, 41, 71–90. https://doi.org/10.1016/j.dr.2016.06.004 MedlineGoogle Scholar
  • Rachal, K. C., Daigle, S., & Rachal, W. S. (2007). Learning problems reported by college students: Are they using learning strategies? Journal of Instructional Psychology, 34(4), 191–199. Google Scholar
  • Ramirez, J. J. (2020). Undergraduate neuroscience education: Meeting the challenges of the 21st century. Neuroscience Letters, 739, https://doi.org/10.1016/j.neulet.2020.135418 MedlineGoogle Scholar
  • Reeve, J. (2002). Self-determination theory applied to educational settings. In Deci, E. L.Ryan, R. M. (Eds.), Handbook of self-determination theory (pp. 183–203). Rochester, NY: University of Rochester Press. Google Scholar
  • Robbins, S. B., Lauver, K., Le, H., Davis, D., Langley, R., & Carlstrom, A. (2004). Do psychosocial and study skill factors predict college outcomes? A meta-analysis. Psychological Bulletin, 130(2), 261–288. https://doi.org/10.1037/0033-2909.130.2.261 MedlineGoogle Scholar
  • Roediger, H. L., Weinstein, Y., & Agarwal, P. K. (2010). Forgetting: Preliminary considerations. In Della Sala, S. (Ed.), Forgetting (pp. 1–22). East Sussex, UK: Psychology Press. Google Scholar
  • Rotgans, J. I., & Schmidt, H. G. (2014). Situational interest and learning: Thirst for knowledge. Learning and Instruction, 32, 37–50. https://doi.org/10.1016/j.learninstruc.2014.01.002 Google Scholar
  • Rovers, S. F. E., Stalmeijer, R. E., van Merriënboer, J. J. G., Savelberg, H. H. C. M., & de Bruin, A. B. H. (2018, December). How and why do students use learning strategies? A mixed methods study on learning strategies and desirable difficulties with effective strategy users. Frontiers in Psychology, 9, 1–12. https://doi.org/10.3389/fpsyg.2018.02501 MedlineGoogle Scholar
  • Rytkonen, H., Parpala, A., Lindblom-Ylianne, S., Virtanen, V., & Postareff, L. (2012). Factors affecting bioscience students’ academic achievement. Instructional Science, 40(2), 241–256. https://doi.org/10.1007/s11251-011-9176-3 Google Scholar
  • Sabel, J. L., Dauer, J. T., & Forbes, C. T. (2017). Introductory biology students’ use of enhanced answer keys and reflection questions to engage in metacognition and enhance understanding. CBE—Life Sciences Education, 16(3), 1–12. https://doi.org/10.1187/cbe.16-10-0298 Google Scholar
  • Scheithauer, M. C., & Kelley, M. L. (2017). Self-monitoring by college students with ADHD: The impact on academic performance. Journal of Attention Disorders, 21, 1030–1039. https://doi.org/10.1177/1087054714553050 MedlineGoogle Scholar
  • Schermelleh-Engel, K., & Moosbrugger, H. (2003). Evaluating the fit of structural equation models: Tests of significance and descriptive goodness-of-fit measures. Methods of Psychological Research Online, 8, 23–74. Google Scholar
  • Schraw, G., & Moshman, D. (1995). Metacognitive theories. Educational Psychology Review, 7(4), 351–371. https://doi.org/10.1007/BF02212307 Google Scholar
  • Schutz, P. A. (2020). Race focusing and reimaging research: Where do we go from here? Contemporary Educational Psychology, 61, 1–4, Advance online publication. https://doi.org/10.1016/j.cedpsych.2020.101871 Google Scholar
  • Sebesta, A. J., & Speth, E. B. (2017). How should I study for the exam? Self-regulated learning strategies and achievement in introductory biology. CBE—Life Sciences Education, 16(2), 1–12. https://doi.org/10.1187/cbe.16-09-0269 Google Scholar
  • Sörbom, D. (1989). Model modification. Psychometrika, 54, 371–384. Google Scholar
  • Stanton, J. D., Neider, X. N., Gallegos, I. J., & Clark, N. C. (2015). Differences in metacognitive regulation in introductory biology students: When prompts are not enough. CBE—Life Sciences Education, 14(2), 1–12. https://doi.org/10.1187/cbe.14-08-0135 Google Scholar
  • Stanton, J. D., Sebesta, A. J., & Dunlosky, J. (2021). Fostering metacognition to support student learning and performance. CBE—Life Sciences Education, 20(2), fe3. https://doi.org/10.1187/cbe.20-12-0289 LinkGoogle Scholar
  • Steel, P. (2007). The nature of procrastination: A meta-analytic and theoretical review of quintessential self-regulatory failure. Psychological Bulletin, 133(1), 65–94. https://doi.org/10.1037/0033-2909.133.1.65 MedlineGoogle Scholar
  • Steel, P., & Konig, C. J. (2006). Integrating theories of motivation. Academy of Management Review, 31(4), 889–913. Google Scholar
  • Svinicki, M. D. (2010). Student learning: From teacher-directed to self-regulation. New Directions for Teaching and Learning, 123, 73–83. https://doi.org/10.1002/tl Google Scholar
  • Tanaka, A., & Murayama, K. (2014). Within-person analyses of situational interest and boredom: Interactions between task-specific perceptions and achievement goals. Journal of Educational Psychology, 106(4), 1122–1134. https://doi.org/10.1037/a0036659
  • Tanner, K. D. (2012). Promoting student metacognition. CBE—Life Sciences Education, 11(2), 113–120. https://doi.org/10.1187/cbe.12-03-0033
  • Tempelaar, D. T., Rienties, B., Giesbers, B., & Gijselaers, W. H. (2015). The pivotal role of effort beliefs in mediating implicit theories of intelligence and achievement goals and academic motivations. Social Psychology of Education, 18(1), 101–120. https://doi.org/10.1007/s11218-014-9281-7
  • Tinto, V. (1993). Leaving college: Rethinking the causes and cures of student attrition. Chicago: University of Chicago Press.
  • Tinto, V. (2017). Reflections on student persistence. Student Success, 8(2), 1–8. https://doi.org/10.5204/ssj.v8i2.376
  • Tovar, E., & Simon, M. A. (2006). Academic probation as a dangerous opportunity: Factors influencing diverse college students’ success. Community College Journal of Research and Practice, 30(7), 547–564. https://doi.org/10.1080/10668920500208237
  • Toven-Lindsey, B., Levis-Fitzgerald, M., Barber, P. H., & Hasson, T. (2015). Increasing persistence in undergraduate science majors: A model for institutional support of underrepresented students. CBE—Life Sciences Education, 14(2), 1–12. https://doi.org/10.1187/cbe.14-05-0082
  • Truschel, J., & Reedy, D. L. (2009). National survey—What is a learning center in the 21st century? Learning Assistance Review, 14(1), 9–22.
  • Tsui, L. (2016). Effective strategies to increase diversity in STEM fields: A review of the research literature. Journal of Negro Education, 76(4), 555–581.
  • Tuckman, B. W. (1991). The development and concurrent validity of the Tuckman Procrastination Scale. Educational and Psychological Measurement, 51, 473–489.
  • Tuckman, B. W. (2005). Relations of academic procrastination, rationalizations, and performance in a web course with deadlines. Psychological Reports, 96, 1015–1021.
  • Tuckman, B. W., & Kennedy, G. J. (2011). Teaching learning strategies to increase success of first-term college students. The Journal of Experimental Education, 79(4), 478–504. https://doi.org/10.1080/00220973.2010.512318
  • Usher, E. L., & Pajares, F. (2007). Self-efficacy for self-regulated learning: A validation study. Educational and Psychological Measurement, 68(3), 443–463. https://doi.org/10.1177/0013164407308475
  • van der Meer, J., Jansen, E., & Torenbeek, M. (2010). “It’s almost a mindset that teachers need to change”: First-year students’ need to be inducted into time management. Studies in Higher Education, 35(7), 777–791. https://doi.org/10.1080/03075070903383211
  • Van Eerde, W. (2000). Procrastination: Self-regulation in initiating aversive goals. Applied Psychology, 49(3), 372–389. https://doi.org/10.1111/1464-0597.00021
  • van Eerde, W., & Klingsieck, K. B. (2018). Overcoming procrastination? A meta-analysis of intervention studies. Educational Research Review, 25, 73–85. https://doi.org/10.1016/j.edurev.2018.09.002
  • van Vliet, E. A., Winnips, J. C., & Brouwer, N. (2015). Flipped-class pedagogy enhances student metacognition and collaborative-learning strategies in higher education but effect does not persist. CBE—Life Sciences Education, 14(3), 1–10. https://doi.org/10.1187/cbe.14-09-0141
  • Vodanovich, S. J., Wallace, J. C., & Kass, S. J. (2005). A confirmatory approach to the factor structure of the Boredom Proneness Scale: Evidence for a two-factor short form. Journal of Personality Assessment, 85(3), 295–303. https://doi.org/10.1207/s15327752jpa8503_05
  • Wang, J., & Wang, X. (2019). Structural equation modeling: Applications using Mplus (2nd ed.). Hoboken, NJ: Wiley.
  • Weinstein, C. E., Husman, J., & Dierking, D. R. (2000). Self-regulation interventions with a focus on learning strategies. In Boekaerts, M., Pintrich, P. R., & Zeidner, M. (Eds.), Handbook of self-regulation (pp. 727–747). San Diego, CA: Academic Press. https://doi.org/10.1016/b978-012109890-2/50051-2
  • White, J. L., Altschuld, J. W., & Lee, Y. F. (2008). Evaluating minority retention programs: Problems encountered and lessons learned from the Ohio Science and Engineering Alliance. Evaluation and Program Planning, 31(3), 277–283. https://doi.org/10.1016/j.evalprogplan.2008.03.006
  • White, S. M., Riley, A., & Flom, P. (2013). Assessment of Time Management Skills (ATMS): A practice-based outcome questionnaire. Occupational Therapy in Mental Health, 29(3), 215–231. https://doi.org/10.1080/0164212X.2013.819481
  • Williams, R. L., & Stockdale, S. L. (2004). Classroom motivation strategies for prospective teachers. Teacher Educator, 39, 212–230.
  • Wilson, Z. S., Holmes, L., DeGravelles, K., Sylvain, M. R., Batiste, L., Johnson, M., ... & Warner, I. M. (2012). Hierarchical mentoring: A transformative strategy for improving diversity and retention in undergraduate STEM disciplines. Journal of Science Education and Technology, 21(1), 148–156. https://doi.org/10.1007/s10956-011-9292-5
  • Wolters, C. A. (2003a). Regulation of motivation: Evaluating an underemphasized aspect of self-regulated learning. Educational Psychologist, 38(4), 189–205. https://doi.org/10.1207/S15326985EP3804_1
  • Wolters, C. A. (2003b). Understanding procrastination from a self-regulated learning perspective. Journal of Educational Psychology, 95(1), 179–187. https://doi.org/10.1037/0022-0663.95.1.179
  • Wolters, C. A., & Benzon, M. B. (2013). Assessing and predicting college students’ use of strategies for the self-regulation of motivation. Journal of Experimental Education, 81(2), 199–221. https://doi.org/10.1080/00220973.2012.699901
  • Wolters, C. A., & Brady, A. C. (2020). College students’ time management: A self-regulated learning perspective. Educational Psychology Review. Advance online publication. https://doi.org/10.1007/s10648-020-09519-z
  • Wolters, C. A., & Hoops, L. D. (2015). Self-regulated learning interventions for motivationally disengaged students. In Cleary, T. (Ed.), Self-regulated learning interventions with at-risk youth: Enhancing adaptability, performance, and well-being (pp. 67–88). Washington, DC: APA Books. https://doi.org/10.1007/978-1-4419-1005-9_100338
  • Wolters, C. A., Iaconelli, R., Peri, J., Hensley, L. C., & Kim, M. (2020, November 10). Investigating the effects of a learning-to-learn course. Council on Research in Student Progress and Educational Excellence Annual Conference at Columbus, OH.
  • Wolters, C. A., Won, S., & Hussain, M. (2017). Examining the relations of time management and procrastination within a model of self-regulated learning. Metacognition and Learning, 12(3), 381–399. https://doi.org/10.1007/s11409-017-9174-1
  • Worthington, R. L., & Whittaker, T. A. (2006). Scale development research: A content analysis and recommendations for best practices. Counseling Psychologist, 34(6), 806–838. https://doi.org/10.1177/0011000006288127
  • Yazedjian, A., Toews, M. L., Sevin, T., & Purswell, K. E. (2008). It’s a whole new world: A qualitative exploration of college students’ definitions of and strategies for college success. Journal of College Student Development, 49(2), 141–154. https://doi.org/10.1353/csd.2008.0009
  • Yosso, T. J. (2005). Whose culture has capital? A critical race theory discussion of community cultural wealth. Race Ethnicity and Education, 8(1), 69–91. https://doi.org/10.1080/1361332052000341006
  • Young, M. R. (2005). The motivational effects of the classroom environment in facilitating self-regulated learning. Journal of Marketing Education, 27(1), 25–40. https://doi.org/10.1177/0273475304273346
  • Zepeda, C. D., Richey, J. E., Ronevich, P., & Nokes-Malach, T. J. (2015). Direct instruction of metacognition benefits adolescent science learning, transfer, and motivation: An in vivo study. Journal of Educational Psychology, 107(4), 954–970. https://doi.org/10.1037/edu0000022.supp
  • Zhao, N., Wardeska, J., McGuire, S. Y., & Cook, E. (2014). Metacognition: An effective tool to promote success in college science learning. Journal of College Science Teaching, 43(4), 48–54. https://doi.org/10.2505/4/jcst14_043_04_48
  • Zheng, L. (2016). The effectiveness of self-regulated learning scaffolds on academic performance in computer-based learning environments: A meta-analysis. Asia Pacific Education Review, 17(2), 187–202. https://doi.org/10.1007/s12564-016-9426-9
  • Zimmerman, B. J., & Schunk, D. H. (2001). Reflections on theories of self-regulated learning and academic achievement. In Zimmerman, B. J., & Schunk, D. H. (Eds.), Self-regulated learning and academic achievement: Theoretical perspectives (2nd ed., pp. 289–308). Mahwah, NJ: Erlbaum.
  • Zusho, A., Pintrich, P. R., & Coppola, B. (2003). Skill and will: The role of motivation and cognition in the learning of college chemistry. International Journal of Science Education, 25(9), 1081–1094. https://doi.org/10.1080/0950069032000052207