
Faculty Beliefs about Intelligence Are Related to the Adoption of Active-Learning Practices

    Published Online: https://doi.org/10.1187/cbe.17-05-0084

    Abstract

    Mounting evidence of the efficacy of active learning has prompted educators to consider adoption of these practices in college-level classrooms. One tenet of active learning is that most, if not all, students have the ability to learn. Instructors’ perspectives on learning, however, may or may not be aligned with this. One belief held by some educators is that intelligence is fixed, that is, some students are more intelligent and have a higher ability to learn than others. Instructors with a fixed mindset may not be convinced that their investment in developing active-learning materials will be as fruitful as the education evidence suggests, because these instructors may not believe that most students can grow in their learning. Here, we explored the relationship between fixed mindsets and the adoption of active-learning strategies. We found that instructors with higher fixed mindsets were less persuaded that active-learning strategies were a good idea and less likely to implement the teaching practices. Our research suggests that development initiatives should explicitly address educators’ lay theories of intelligence (fixed or growth mindset) to support successful implementation of active learning.

    INTRODUCTION

    National initiatives in college-level education reform have emphasized active learning as a key area of focus for introductory courses (American Association for the Advancement of Science, 2011; President’s Council of Advisors on Science and Technology, 2012). Active learning is operationalized in the college classroom as a range of student-centered curricular events that engage students through, for example, peer collaboration, experimentation, and problem solving (Handelsman et al., 2007; Cavanagh et al., 2018). As evidence of the efficacy of active-learning strategies grows, there are increasing calls to train greater numbers of instructors1 to support these practices in the college science classroom (Bradforth et al., 2015).

    Evidence from research in education supports the benefits of active learning. For example, when instructors successfully implement active learning and see positive results from their efforts, they undergo positive transformations, such as an increased desire to teach, greater feelings of self-efficacy and confidence in their teaching, and a renewed passion for teaching (Guskey, 1985). Despite these instructor-centered advantages, and the student-centered benefits documented elsewhere, many real and perceived barriers stand in the way of instructors adopting active-learning practices. Educators who take a lecture-only approach generally agree that an active and engaged college classroom is important, but those sentiments do not always translate into the actual implementation of active learning. This gap is thought to be due in part to instructors’ perceived barriers, such as a lack of necessary class time, a strong comfort level with traditional lectures, and insufficient materials (Miller and Metz, 2014). Previous research suggests that even specific training does not guarantee adoption: college instructors who attended an intense, weeklong workshop on teaching more actively within a large-classroom context were not necessarily implementing these practices in their own classrooms (Ebert-May et al., 2011).

    The commonly cited structural barriers to adoption of active learning, such as time and lack of rewards, capture only one facet of barriers to change. Another influential aspect is instructors’ beliefs about students’ abilities and beliefs about their own responsibility in teaching (Kagan, 1992). One potentially important factor is instructors’ beliefs about the malleability of intelligence. Beliefs about intelligence fall on a spectrum (Dweck, 1986). On one end is a growth mindset, in which intelligence is believed to be malleable, in that it can continuously grow with effort. On the other end is a fixed mindset, in which intelligence is viewed as finite, that is, a person eventually reaches a limit in his or her intelligence that he or she cannot move beyond.

    Research has shown extraordinary benefits to the growth mindset (e.g., see a review in Burnette et al., 2013). Individuals holding growth mindsets are less likely to draw negative conclusions about ability (their own or others), they are more likely to persevere through difficulties, and they hold sustained motivation and effort toward learning (Dweck, 1999; Heine et al., 2001; Blackwell et al., 2007). The underlying assumption in the growth mindset is that those who are committed to the learning process can rise to the challenges of learning and succeed.

    In contrast, individuals who endorse a fixed mindset respond with agreement to statements like “I believe that you have a certain amount of intelligence and you really can’t do much to change it” (theories of intelligence measure; Dweck et al., 1995). Fixed theories of intelligence, at best, hold that all individuals have a fixed but roughly equal intelligence or, at worst, that intelligence is heterogeneous, that is, there are people with high and low fixed levels of intelligence. Research shows that individuals who hold fixed mindsets are likely to draw negative conclusions about intellectual abilities based on setbacks, they are more likely to give up when faced with difficulties, and they lose motivation toward learning when faced with failure (Dweck, 1999; Heine et al., 2001; Blackwell et al., 2007). These findings imply that the fixed view encompasses a belief in the heterogeneous distribution of intelligence. The underlying assumption in a fixed-mindset theory of intelligence is that only those who are capable will learn. This assumption conflicts with active-learning innovations. For the instructor who truly believes that only the “smartest” students will understand complex topics, it may seem fruitless to make large investments of time and resources for essentially equivalent outcomes, because “only the bright students will understand anyway.”

    We hypothesized that the selection of teaching practices may be influenced by the mindset an individual holds. Active learning deliberately structures opportunities for students to practice skills in class, with the underlying assumption that not only the “smart” students, but most students can improve in their learning, because learning is a process of growth. The active-learning perspective thus aligns more with the growth-mindset view of intelligence than with the fixed mindset. Indeed, one study found that instructors holding a fixed mindset were less likely to encourage students to practice and try again, and instead placated students whom they believed lacked the capability with statements such as “It’s ok, [science] isn’t for everyone” (Rattan et al., 2012). Therefore, college-level instructors with fixed mindsets may be less willing to change their teaching practices, because if one truly believes that only some students will learn, then the effort expended on creating an active-learning environment may seem to outweigh the potential benefits.

    EPIC-Implementation Model

    Adoption of new teaching strategies by instructors can be conceptualized as a multistage process. The model of adoption developed by Aragón and Graham (2015) recognizes five stages: exposure, persuasion, identification, commitment, and implementation (EPIC-implementation). This adoption model was designed to be inclusive of existing motivation, goal, and change theories (e.g., theory of planned behavior [Ajzen, 1991]; theory of reasoned action [Fishbein and Ajzen, 2011]; social cognitive theory [Bandura, 1991]; life-span theory of control [Heckhausen and Schulz, 1995]; goal achievement [Gollwitzer, 1993]; theory of goal setting and task performance [Locke and Latham, 1990]; diffusion of innovation theory [Rogers, 2010]; social interaction model [Rogers and Shoemaker, 1971]). The adoption process framework was developed as an open framework in which additional factors proposed to impact adoption (e.g., logistic concerns over classroom layouts [Baepler et al., 2014]; feelings of self-efficacy [Bandura, 1977]; balancing research and teaching [Wright, 2005]) might be tested. Relationships that have been uncovered between factors and steps in the model have not only identified factors relevant in the process (Aragón et al., 2017; Cavanagh et al., 2018), but also inform college science education reform leaders where interventions might need additional attention.

    Exposure.

    The first stage, exposure, establishes the premise that educators must be exposed clearly to the proposed teaching method. This first threshold pertains to learners’ basic memory for and understanding of the proposed change, that is, taking away from the intervention what it is that they are being asked to do (also indicated as essential in diffusion of innovation theory [Rogers, 2010] and the social interaction model [Rogers and Shoemaker, 1971]). Within a cohort of educators introduced to teaching innovations, widespread failure to meet this earliest threshold might indicate problems in the original communication of the desired practice.

    The exposure stage also can be impacted by instructors’ attitudes. Attitudes can interact with attitude-consistent and attitude-inconsistent information, leading to greater selection and elaboration or greater filtering out, discounting, or distortion of the incoming information. This interaction changes systematically, dependent on the level of controversy surrounding the topic, the strength of the attitude, and whether the attitude relates to the learner’s central values (Eagly et al., 1999). Including this step in the EPIC model reduces ambiguity about whether the information was communicated effectively, provides the opportunity to understand the relationships between attitudinal individual differences and exposure, and provides a foundation on which to build the subsequent steps of the model.

    Persuasion.

    The second step of the model posits that instructors must be persuaded that the proposed practice is a good idea. We do not suggest that the first step, exposure, causes persuasion, but rather that memory for and understanding of the proposition is a threshold that must be met before one can deliberate whether or not a proposition is a good idea. University instructors are particularly autonomous.2 This sense of autonomy predicts their implementation of suggested classroom innovations (Gorozidis and Papaioannou, 2014) and underscores that university-level instructors must themselves believe that the proposed teaching practices are a good idea. Organizational research has found that successful adoption of new strategies relies, in part, on employees being persuaded that the changes are of value (e.g., Herscovitch and Meyer, 2002). Work in the education literature converges on the idea of persuasion; for example, primary and secondary school teachers who believed a teaching practice was of value also increased their implementation efforts (Abrami et al., 2004). Indeed, many studies have found that attempts to reform teaching toward student-centered approaches rely on instructors’ persuasion, or buy-in, for their success (for a review, see Waugh and Punch, 1987).

    From the psychology literature, most models of motivation (e.g., theory of planned behavior [Ajzen, 1991]; theory of reasoned action [Fishbein and Ajzen, 2011]; social cognitive theory [Bandura, 1991]; life-span theory of control [Heckhausen and Schulz, 1995]; theory of goal setting and task performance [Locke and Latham, 1990]; goal achievement [Gollwitzer, 1993]) show theoretical consensus that being convinced a particular behavior or outcome is a good idea is essential to committing to the goal, which increases striving toward and attainment of that goal (for a review, see Gollwitzer and Sheeran, 2006). This early step is described within the model of action phases (Gollwitzer, 1990; also described within motivational models such as theory of reasoned action [Fishbein and Ajzen, 2011]; and within change models such as those presented by Rogers [2010] and Rogers and Shoemaker [1971]) as part of the “predecisional” phase, in which the desirability of a proposition is considered. In this instance, with university instructors, persuasion was considered an important step in the autonomous adoption of changes to their practices. Aragón and Graham (2015) hypothesized that attitudes about the topic and previously held knowledge or experiences could substantially impact persuasion. For example, in an investigation using the EPIC model, Aragón et al. (2017) found that, when instructors held strong color-blind beliefs (i.e., all people should be considered the same), they were not persuaded that inclusive teaching practices that acknowledge the differences between people were a good idea. In the present paper, we hypothesized that instructors’ theories of intelligence could be related to the step of persuasion within the EPIC model, because the premise of fixed intelligence runs counter to the idea that every student can grow in his or her learning.

    Identification.

    The third step in the model is identification, which, much like persuasion, overlaps substantially with a variety of motivational theories (theory of planned behavior [Ajzen, 1991]; theory of reasoned action [Fishbein and Ajzen, 2011]; social cognitive theory [Bandura, 1991]; life-span theory of control [Heckhausen and Schulz, 1995]; theory of goal setting and task performance [Locke and Latham, 1990]; goal achievement [Gollwitzer, 1993]). This step is also part of the “predecisional” phase, in which the deliberator considers whether or not a particular behavior or outcome is good for him or her. Identification maps onto the idea of feasibility, or the likelihood that one personally might be able to accomplish a desired outcome or goal. We place this step after persuasion, not because we anticipate persuasion to cause identification, but because persuasion is what establishes the behavior or outcome as desirable in the first place.

    In this step, personal identification can be influenced by internal psychological factors, external personal factors, or broader circumstances affecting the individual, including a feeling that the activity is simply not a “fit” for him or her personally. For example, an instructor could think that an active-learning technique is a great idea but feel that she personally does not have the skill (either social or technical) to implement it. Another instructor might like an activity but not feel that he personally has enough time to develop it. Likewise, an instructor could be enthusiastic about an active-learning technique but lament that it would not work for her personally, because she teaches in an amphitheater-type classroom.

    Thinking that an activity is good (persuasion) and thinking that it is good for one’s self (identification) are dissociable constructs (Clarke, 1996; Henderson et al., 2011). For the EPIC model to be an effective diagnostic tool, it should provide information about where along the pathway improvements to interventions might be made. Therefore, these two constructs were measured independently.

    Commitment.

    The fourth step, commitment, marks the end of the predecisional phase and the beginning of the preactional phase, in which an intention (Gollwitzer, 1990; the “decision” stage in Rogers [2010] and Rogers and Shoemaker [1971]) has been formed to implement a given practice (Gorozidis and Papaioannou, 2014). While intentions are not one-to-one predictors of future actions (for a review, see Fishbein and Ajzen, 2011), including commitment in this model allows one to see not only the rate at which those who commit actually act, but also the factors that might explain adherence to or abandonment of one’s commitment. Factors anticipated to impact this late step, between commitment and implementation, are likely to be unexpected and external to the adopter (Gollwitzer and Sheeran, 2006). For instance, instructors might find that their universities will not provide the resources needed to implement a given practice, or might find it more difficult to locate materials than originally anticipated. Meeting the thresholds of the three previous steps (exposure, persuasion, and identification) before commitment does not indicate a causal process, but rather a qualitative process in which the addition of commitment should strengthen persistence toward the goal. If persuasion and/or identification are missing, individuals might still implement the teaching practice because of external pressures (such as a request from their department) or might tentatively “try out” a practice without full persuasion or identification.

    Implementation.

    The fifth step is implementation, wherein instructors incorporate routine use of the classroom practices. We anticipate that implementation would proceed much as Guskey (1985) described: a trial-and-error process in which positive or negative student feedback and experiences help refine the adopted practices over time.

    In summary, the present study asks a novel question, and one that is pragmatically important for the college classroom: If college-level instructors do not really believe that all students they encounter are capable of growth in their learning, will the instructors invest the requisite time and energy into new practices to help students reach their potential? We predicted that instructors’ mindsets would be most related to the stage of persuasion. Instructors with fixed mindsets may be less convinced that incorporating activities into their classes can help students improve, whereas instructors with growth mindsets may be more persuaded. In addition, we predicted that, because of lower persuasion, instructors with a fixed mindset would also be less likely to implement active learning relative to instructors with a growth mindset.

    METHOD

    We tested these ideas with participants of the Summer Institutes on Scientific Teaching, a national training program in which college-level instructors participated in a 4-day intensive training in evidence-based active-learning practices. Workshop curricula focused on engaging students in 1) their own learning, 2) monitoring their own learning, and 3) discussions within groups and the whole classroom (Handelsman et al., 2007; Pfund et al., 2009).

    Participants

    All known past participants of the Summer Institutes from 2004 to 2014 (n = 1179) were invited to participate via email. Of the 750 respondents who opened the link, 661 provided complete data for the critical variables of interest for this investigation. Of those, 41 reported that they were not actively teaching. Our final sample therefore comprised 620 respondents (83% of those who logged in and agreed to participate): 362 women and 258 men, from 254 universities. Respondents were not compensated but were promised a report of the findings once data analysis and writing were completed (Aragón and Graham, 2015).

    Our sample was on average 47.6 years old (SD = 10.6; age range 27–78 years old; 36 participants did not provide age information) and respondents primarily self-identified as white (83.0% white, 4.2% Hispanic/Latino, 3.5% Black/African American, 3.5% Asian, 5.2% other, and 3.1% did not provide ethnicity information). Current academic positions were as follows: 241 (39.7%) were tenure-track professors; 140 (23.1%) were senior or tenured professors; 171 (28.2%) were non–tenure track college or university instructors; 27 (4.4%) were administrators or in professional development; 28 (4.6%) were graduate students or in postdoctoral positions; and 13 (2.1%) did not supply this information. The majority of respondents were actively teaching, with 95.0% having taught within the past year. Our respondents had on average taught for 14.54 years (SD = 10.03).

    Instructors attending the Summer Institutes on Scientific Teaching as program participants are trained in principles about how people learn, how to use a variety of teaching methods to engage students, and how to assess their students’ learning progress (see www.summerinstitutes.org). For example, instructors learn how to design lectures from well-defined learning goals and objectives, a process described by Wiggins and McTighe (2005) as “backward design.” Instructors also learn how to choose from various activities to engage students; how to consider diverse types of learners; how different teaching approaches might be applied to engage learners with various social and ethnic backgrounds and learning approaches; how to create and administer formative and summative assessments; and how to foster metacognition.

    Materials and Procedures

    A private link to our online survey was emailed to Summer Institute participants requesting that they complete the approximately 15- to 30-minute Summer Institute Census Survey. Respondents were allowed to leave the survey and return within a week without losing their place, although nearly all respondents completed the survey in one session. After participants provided informed consent and agreed to participate, the survey began with the following general instructions: “This survey has two sections. In the first section we will ask questions related to scientific teaching. In the second section we will ask questions related to you. Please remember your spontaneous responses are important and there are no right or wrong answers.” The teaching practices questions were located in the first section of the survey, and the instructor mindset (i.e., theory of intelligence) variable was located in the second section.

    Measure of Teaching Practices.

    We aligned our survey questions with the Summer Institutes curriculum by considering information from four sources: the Scientific Teaching book on which most of the Summer Institutes curriculum is based (Handelsman et al., 2007), a taxonomy that included these active-learning strategies (Couch et al., 2015), a theoretical model for the evaluation of the Summer Institutes (National Science Foundation grant Transforming Undergraduate Education in STEM #1323258), and the personal experiences of the authors as participants or administrators of the Summer Institutes.

    We asked the respondents to consider seven active-learning practices: structuring class time to include activities that engage students in their own learning, using exercises that generate group discussion, using exercises that lead students to draw their own conclusions, encouraging students to generate class-wide discussions, implementing formative assessments (assessments while learning is occurring) that inform students’ progress toward desired outcomes, encouraging students to think of science within the context of society, and identifying students’ misconceptions so that they may be corrected. In total, there were 19 prompts focused on three scientific teaching practices: 1) active learning; 2) formative assessment; and 3) inclusive teaching. The present study is focused on active learning, which was represented in seven of the 19 prompts.

    The descriptions of the seven active-learning practices were written down a column on the left side of the page. To the right of each description was a row of six boxes. Participants were instructed to check a box whenever they agreed with the statement that labeled it. Through this method, respondents provided a binary endorsement (yes or no) for each of the following statements3: 1) I was exposed to this, 2) I was convinced that this is good, 3) This is compatible with my teaching, 4) I made a decision to incorporate this in my teaching, and 5) I implemented this teaching practice in my course. (See Figure 1.)

    FIGURE 1.

    FIGURE 1. Bar graph depicting the mean number of active-learning teaching practices endorsed along each step of the adoption process. The error bars represent ±2 standard errors of the mean. As participants progressed down the proposed process of adoption, significantly fewer teaching practices were endorsed.

    To familiarize respondents with the procedure, we first provided them with hypothetical responses and described how they would be interpreted. We further explained, “We will interpret these data by the boxes that you check. Please read each statement in the left column, and then please check ALL boxes that apply. If you do not check a box we will assume that you do not endorse the statements related to it.” At the bottom of each page, participants were asked, “Are you sure you checked ALL boxes that apply?” with “yes” and “no” options. If participants responded “no,” the survey took them back to the top of the page and, in red font, requested that they please check all boxes that applied. Participants were redirected in this way until they indicated that they had checked all boxes that applied. This format followed past work reporting the adoption process model (Cavanagh et al., 2016; Aragón et al., 2017). The structure of the questions made it possible for respondents to consider each practice at each stage of the adoption process independently, and it did not require endorsement of any stage before the next.

    Data were aggregated across teaching practices for each step of the adoption process, yielding scores from 0 (no teaching practices endorsed) to 7 (all teaching practices endorsed). Crucially for our investigation, a majority of respondents (71%) recalled being exposed to all seven of these practices. For the remaining 29%, the distribution was as follows: 15% recalled six practices, 7% recalled five, 4% recalled four, <1% recalled three, 2% recalled two, 2% recalled one, and 1% recalled none. Year of attendance had no relationship to the number of practices respondents reported being exposed to, r = −0.03, p = 0.51, highlighting that even respondents who had attended a decade earlier recalled the seven practices at a rate nearly equivalent to recent attendees, thus justifying the inclusion of all years in the sample. The results reported below are also significant and in the same direction if we include in the analysis only those respondents who recalled all seven of the teaching practices introduced to them. Descriptive statistics and tests of the measure’s internal structure are presented in Table 1.
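    The aggregation just described can be sketched in a few lines (a minimal illustration, not the authors’ code; the variable names and the sample respondent are hypothetical):

```python
# Illustrative scoring sketch: each respondent's checkbox data is modeled
# here as a list of 7 sets, one per active-learning practice, holding the
# adoption stages the respondent endorsed for that practice.
STAGES = ["exposure", "persuasion", "identification", "commitment", "implementation"]

def stage_scores(responses):
    """For each adoption stage, count how many of the seven practices
    a respondent endorsed at that stage (score range 0-7)."""
    return {stage: sum(1 for endorsed in responses if stage in endorsed)
            for stage in STAGES}

# Hypothetical respondent: endorsed every stage for 5 practices,
# but only exposure and persuasion for the remaining 2.
resp = [set(STAGES)] * 5 + [{"exposure", "persuasion"}] * 2
print(stage_scores(resp))
# {'exposure': 7, 'persuasion': 7, 'identification': 5, 'commitment': 5, 'implementation': 5}
```

    Note that no stage requires endorsement of an earlier one, matching the independent checkbox format of the survey.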

    TABLE 1. Descriptive statistics for outcome variables

                          Correlation matrix               Descriptive statistics                CFA
                          E      P      I      C    IMP    Mean   SD    Skewness  Kurtosis  KR20  CFI    RMSEA
    E    Exposure        1.00                              6.37   1.33   −2.82      8.37    0.79  0.996  0.001
    P    Persuasion      0.647  1.00                       5.94   1.63   −1.88      3.20    0.78  0.985  0.038
    I    Identification  0.339  0.584  1.00                5.55   1.75   −1.27      0.88    0.74  0.969  0.059
    C    Commitment      0.429  0.685  0.675  1.00         5.23   1.90   −1.08      0.36    0.76  0.966  0.062
    IMP  Implementation  0.403  0.518  0.573  0.701  1.00  4.57   2.06   −0.65     −0.53    0.75  0.981  0.044

    KR20, Kuder-Richardson 20; RMSEA, root mean square error of approximation.

    Some of our respondents came from the same home institutions, which raised the question of the independence of their data. We therefore ran linear mixed models for each outcome variable with participants’ home institution entered as a random effect and found that cohorts of participants from the same home institution did not respond more similarly to one another on the five outcome variables: intraclass correlations ranged from 0.001 to 0.035, with p values for the random intercept ranging from 0.23 to 0.989. We also tested whether our data met the assumptions of linear regression. Plots of standardized predicted values against standardized residuals, fitted with a Loess curve, showed the relationships between predictors and outcome variables to be linear (a near-zero relationship between the variables) and the residuals clustered around zero. Q-Q plots showed points generally falling along the trend line, with slight deviations at the tails, indicating an approximately normal distribution of the residuals.
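    The logic of the intraclass correlation check can be illustrated with a one-way random-effects ICC on simulated data (a sketch under our own assumptions; the paper fitted linear mixed models, and the simulated values below are ours, not the study data):

```python
import numpy as np

# One-way random-effects ICC for balanced groups:
# ICC(1) = (MSB - MSW) / (MSB + (k - 1) * MSW),
# where MSB/MSW are the between- and within-group mean squares.
def icc1(groups):
    """Compute ICC(1) from a list of equal-sized groups of scores."""
    k = len(groups[0])                          # group size
    n = len(groups)                             # number of groups
    grand = np.mean([x for g in groups for x in g])
    means = [np.mean(g) for g in groups]
    msb = k * sum((m - grand) ** 2 for m in means) / (n - 1)
    msw = sum((x - m) ** 2 for g, m in zip(groups, means) for x in g) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

rng = np.random.default_rng(0)
# Simulated "institutions" with no shared institutional effect:
flat = [list(rng.normal(5, 1.6, size=4)) for _ in range(50)]
print(round(icc1(flat), 3))  # near zero (the reported ICCs ranged 0.001-0.035)
```

    A near-zero ICC, as observed in the study, indicates that treating respondents as independent observations is reasonable.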

    Measure of Implicit Theory of Intelligence (Growth/Fixed Mindset).

    We used a validated measure of implicit theories of intelligence intended to capture attributions of intelligence about one’s self or others (this sample α = 0.95, three items; Dweck et al., 1995). The items were “I believe that you have a certain amount of intelligence and you really can’t do much to change it,” “I believe that your intelligence is something about you that you can’t change very much,” and “I believe that you can learn new things, but you can’t really change your basic intelligence.” Following the protocol of Dweck and colleagues (1995), Likert-type scales with no neutral midpoint were provided to indicate levels of agreement (1 = strongly disagree; 6 = strongly agree). The three items were averaged to create a single score, with high scores indicating a fixed mindset and low scores indicating a growth mindset. These educators were self-selected individuals who committed time to attend a workshop to improve their teaching practices. One might expect this to incline them toward a growth mindset, and the data support this: the mean on the fixed-mindset measure was 2.84 (SD = 1.27) on the 1 to 6 scale, with 35.5% of the educators falling in the range of agreement with the fixed-mindset items and 64.5% in the range of disagreement.
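    As a sketch, the scoring just described amounts to a simple average (the item wordings are quoted from the measure above; the function name and the 3.5 agreement cutoff annotation are our own assumptions about the 1-6 scale with no midpoint):

```python
# Fixed-mindset items from the theories of intelligence measure
# (Dweck et al., 1995), rated 1-6 with no neutral midpoint.
FIXED_ITEMS = [
    "I believe that you have a certain amount of intelligence and you "
    "really can't do much to change it",
    "I believe that your intelligence is something about you that you "
    "can't change very much",
    "I believe that you can learn new things, but you can't really "
    "change your basic intelligence",
]

def mindset_score(ratings):
    """Average the three 1-6 agreement ratings into one score; higher
    values indicate a more fixed mindset. With no neutral midpoint,
    means above 3.5 fall on the agreement (fixed) side of the scale."""
    assert len(ratings) == len(FIXED_ITEMS)
    assert all(1 <= r <= 6 for r in ratings)
    return sum(ratings) / len(ratings)

print(mindset_score([2, 3, 2]))  # about 2.33: disagreement range, growth-leaning
```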

    Analysis

    Preliminary Analyses.

    Descriptive statistics showed a decline in the number of active-learning strategies that were endorsed along each stage of the EPIC model.

    Main Analyses.

    We fitted our EPIC model using structural equation modeling (SEM), including the theory of intelligence factor, to identify whether and where along this EPIC process of adoption the theory of intelligence factor may be related to reported implementation of active learning in the classroom.4

    RESULTS

    Model Fit and Estimation Methods

    Model fit was assessed using multiple fit indices. We report not only the model chi-square and degrees of freedom, but also the comparative fit index (CFI) and two measures of absolute fit, the root mean square error of approximation (RMSEA) and the standardized root mean square residual (SRMR), as suggested by Hu and Bentler (1999). CFI values ≥0.95 are considered indicative of good model fit; RMSEA values <0.06 and SRMR values <0.08 also indicate good model fit. Models were estimated using maximum-likelihood estimation with robust SEs, which protect against the slightly nonnormal data commonly found with ordinal survey items. Modeling was carried out in R v. 3.4.3 (R Core Team, 2015) using the lavaan package (Rosseel, 2012).
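    The cutoff logic can be summarized in a small helper (ours, for illustration only; the actual modeling was done in R with lavaan). Applied to the three models reported below, it flags only the final model as fitting well:

```python
# Hu & Bentler (1999) conventional cutoffs, as applied in this section.
def acceptable_fit(cfi, rmsea, srmr):
    """True when all three indices meet the cutoffs:
    CFI >= 0.95, RMSEA < 0.06, SRMR < 0.08."""
    return cfi >= 0.95 and rmsea < 0.06 and srmr < 0.08

# Fit statistics of the three estimated models, in order:
print(acceptable_fit(0.94, 0.142, 0.074))    # initial single-path model: False
print(acceptable_fit(0.989, 0.063, 0.024))   # + persuasion->commitment: False (RMSEA)
print(acceptable_fit(1.0, 0.008, 0.011))     # final model: True
```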

    Model Fit

    We initially estimated a model with a single path through exposure, persuasion, identification, and commitment to implementation. This model did not fit acceptably (χ2 (df = 16) = 158.28, p < 0.0001; robust CFI = 0.94; robust RMSEA = 0.142; SRMR = 0.074). Modification indices, as well as prior work with the EPIC model (Aragón and Graham, 2015; Aragón et al., 2017; Cavanagh et al., 2016), showed that persuasion acts directly on commitment, so we added this path to the model. This markedly improved model fit, although the most stringent test was still significant (χ2 (df = 15) = 43.7, p < 0.0001; robust CFI = 0.989; robust RMSEA = 0.063; SRMR = 0.024). Previous reports of the EPIC model also showed direct paths from each stage of the adoption model to implementation (see the Supplemental Material for the serial mediation results; Aragón and Graham, 2015; Aragón et al., 2017; Cavanagh et al., 2016); we added these pathways in the final model. With this addition, all fit statistics indicated that our model fit the data well (χ2 (df = 12) = 12.3, p = 0.42; robust CFI = 1.0; RMSEA = 0.008; SRMR = 0.011). This final model is represented in Figure 2.

    FIGURE 2.

    FIGURE 2. Conceptual model of all tested direct and indirect effects of instructors’ beliefs about student intelligence on the different stages of the adoption of active learning.

    Theory of Intelligence Mindset Analysis

    The focus of this paper was to explore whether instructors’ mindsets about student intelligence influenced their adoption of active learning and to identify where in the adoption process this influence occurs. Inspection of the parameter estimates indicated that mindset did influence the adoption process: significant effects were observed at the persuasion and implementation stages (see Figure 3). A 1 SD increase in the belief that students’ intelligence is fixed was associated with a 0.12 SD decrease in how persuaded instructors were by the evidence for different active-learning practices. Similarly, a 1 SD increase in the belief that student intelligence is fixed was associated with an additional 0.11 SD decrease in the implementation of active-learning practices. As a whole, the predictors in this model explained 53% of the variation in implementation of active learning.
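To make the standardized coefficients concrete, they can be converted back to raw scale units by rescaling with the variables' SDs. The SDs below are hypothetical placeholders (the study reports standardized effects only), so this is purely an illustration of the arithmetic:

```python
# Reported standardized direct effects of fixed mindset
# (SD change in the outcome per 1 SD increase in fixed mindset)
beta_persuasion = -0.12
beta_implementation = -0.11

# Hypothetical scale SDs, NOT values from the study
sd_mindset = 1.2     # assumed SD of the mindset measure
sd_persuasion = 0.9  # assumed SD of the persuasion measure

# Raw-unit slope: expected change in the persuasion score per
# one-point increase on the mindset scale
raw_slope = beta_persuasion * (sd_persuasion / sd_mindset)
print(round(raw_slope, 3))  # -0.09
```

The same rescaling applies to the −0.11 implementation effect; the point is simply that a standardized coefficient is a slope expressed in SD units, not in the scales' original points.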

    FIGURE 3.

    FIGURE 3. The final SEM showing direct and indirect effects of instructors’ beliefs about student intelligence on the adoption of active-learning practices.

    In Figure 3, values next to paths with one arrowhead represent standardized estimates of the impact of the variable at the tail on the variable at the arrowhead. Values on double-headed lines represent correlation coefficients. Gray paths are not significant at the alpha = 0.05 level (*p ≤ 0.05; **p ≤ 0.001). Dotted paths indicate a negative relationship between the variables, and solid paths indicate a positive relationship. The proportions of variance explained (R2) by the complete set of predictors were R2persuasion = 0.427, R2identification = 0.343, R2commitment = 0.584, and R2implementation = 0.528.

    DISCUSSION

    In the present study, we showed the importance of considering instructors’ theories of intelligence in the implementation of instructional practices, like active learning, that are beneficial to students. Science instructors who indicated having a fixed mindset reported implementing fewer active-learning practices. Our interpretation of this finding is that a fixed mindset is, by definition, less compatible with implementing active-learning teaching practices in the classroom, because these practices rest on the premise that all students can learn through increased engagement. Two outcomes of our study merit consideration in more depth.

    First, instructors who held the belief that intelligence is fixed—that is, that students either “have it” or they don’t—reported being less persuaded that active-learning practices are a good idea. Instructors with a more fixed mindset also reported implementing fewer of the active-learning strategies proposed to them at the Summer Institutes as compared with similarly trained instructors holding a growth mindset. A fixed mindset favors the idea that intelligence varies among individuals and that not all individuals are equally capable. The belief that all students can learn and grow, by contrast, defines a growth mindset; therefore, by definition, a growth mindset is more aligned with the principles of active learning. As predicted, instructors holding a growth mindset reported implementing more active-learning practices. These data illustrate that a fixed mindset (Dweck, 2012) may be an important individual difference to consider when attempting to persuade instructors toward pedagogical change.

    Second, after participating in a 4-day training workshop focused on evidence-based teaching practices that included active-learning strategies, instructors in our sample followed the hypothesized process of adopting active-learning practices—moving through exposure, persuasion, identification, and commitment (EPIC), and then to implementation. The EPIC model allowed us to identify where along the process of adoption an intervention might be targeted.

    In terms of practical implications, the present study indicated that the persuasion stage of the EPIC-implementation model (see Aragón et al., 2017) is where a fixed mindset was related to implementation of active-learning teaching practices in the classroom. To preempt this, there are distinct points during the Summer Institute curriculum—as well as in other comparable instructor development programs—when education about theories of intelligence could be productive. Interventions might benefit from providing educators with knowledge of the downstream effects of both growth and fixed mindsets, explicitly emphasizing the relationship between such mindsets and the implementation of active-learning innovations. Once provided with this information, educators can make their own informed decisions about their pedagogical practices.

    Limitations

    One limitation of this work is that the relationship between instructors’ theories of intelligence and their adoption of active-learning practices was based on self-reports, which might reflect attempts to cast oneself in a socially desirable light. Moreover, self-reports of strategy adoption and classroom implementation are not direct measures of how instructors actually teach. Readers should keep this in mind as they consider our results—the greater commitment to active-learning strategies and greater endorsement of their implementation reported by growth-mindset instructors may differ from actual practice. Because our results emphasize the relationship between mindset and perceptions of teaching practice adoption and implementation, future research should attempt to measure instructors’ implementation of active learning directly (e.g., through classroom observation).

    In addition, we fully acknowledge that causal assertions are not possible with the current design, as all variables were captured close in time. We do, however, see promise in the EPIC-implementation process model, particularly as a way of showing at which step problems might arise. Longitudinal designs, experimental manipulations of theories of intelligence, and direct observations of teaching practices might help future research evaluate the hypothesized causal relations. Finally, our sample represented 53% of the 1179 attendees of the Summer Institutes. Thus, another limitation of the current work is that these instructors self-selected their participation at two points: first upon attending the SI, and again upon participating in the survey. These instructors may be especially motivated or conscientious and may represent a “best-case scenario” for implementation of active-learning strategies.

    Implications

    Although we do not know the impact a mindset intervention could have on instructors’ implementation of active learning, we can speculate how it might improve the likelihood of adoption by looking at results from related fields such as marketing and organizational psychology. For example, mindset has been shown to influence consumer behavior: fixed-mindset consumers are more likely to select brands that reflect well on their self-image, while growth-mindset consumers are more likely to select brands in line with their goals to improve and learn new things (Murphy and Dweck, 2016). Mindset has also been shown to matter in the leadership and work engagement literatures: growth-mindset employees engage in their jobs in ways that are more likely to initiate change, not only within themselves, but also within their organizations in meaningful, proactive ways (Caniëls et al., 2018). Moreover, interventions that emphasize a growth mindset have been shown to aid change across a range of human behaviors, including conflict resolution, chronic adolescent aggression, race relations, and individual willpower (Dweck, 2012).

    Thus, aligning instructors toward a growth mindset for the pedagogical changes of interest, such as active learning in the college science classroom, appears to be a potentially fruitful intervention, given that shifting people’s beliefs can alter even basic human qualities (Dweck, 2012). The question then becomes: how? Fortunately, the psychology and education literatures suggest a spectrum of evidence-based techniques for helping individuals confront or change their theory of intelligence. These include relevance interventions, framing struggles as opportunities rather than permanent failings, case studies, and explicit examples of growth in ability over time (see Good et al., 2003; Hulleman and Harackiewicz, 2009; Paunesku et al., 2015; Yeager et al., 2016).

    CONCLUSION

    Understanding individual differences among instructors might provide insight into how to most effectively persuade them to adopt evidence-based teaching practices. For instance, the results of the present research suggest that a brief explanation of fixed versus growth mindsets may be appropriate before introducing instructors to active-learning teaching practices for their classrooms. This is vital; if instructors are not aware of the consequences of fixed mindsets, particularly how a fixed mindset might negatively affect their regard for their students, then they are less likely to be persuaded that active learning can benefit the majority of students who encounter it, and the intervention, for many, will stop there. Once instructors understand the implications of a fixed mindset, they can make their own informed decisions about their teaching practices. For the implementation of active-learning practices, effectively delivering a message about mindsets might be important to the success of those educational reforms.

    FOOTNOTES

    1We use the terms “instructor” and “educator” to include graduate students, postdocs, tenure-track faculty, or non–tenure track faculty who teach at the college level.

    2It can be difficult to draw parallels from the organizational or educational (K–12) literatures to college-level educators because of differences in autonomy between these populations. For example, the factor of autonomy can reverse the antecedent–consequent order of persuasion, commitment, and implementation. When teaching reform is not autonomous, adoption begins with mandated implementation, which may eventually be followed by persuasion and, last, a commitment to earnestly invest in the new initiative (Guskey, 1985).

    3We included the items “I was not exposed to this” and “I was exposed to this but it was not clear to me” to differentiate between participants who did not recall a teaching practice and those who were unclear about the practice. We also included the item “My specific plan to implement this is in progress” to provide an option for those who were in this stage of implementation.

    4In previous presentations of the EPIC model (Aragón and Graham, 2015; Aragón et al., 2017; Cavanagh et al., 2016), the model was tested through serial mediation to 1) test whether the adoption progressed in a manner that was as hypothesized through the exposure–persuasion–identification–commitment (EPIC) implementation model, and then 2) test where within this process other factors might be related. To provide continuity between past work and the current endeavor, we provide an analysis using serial mediation modeling in the Supplemental Material Appendix. The results are nearly identical to what is reported here using the SEM model.

    ACKNOWLEDGMENTS

    This research was made possible through a National Science Foundation Transforming Undergraduate Research in the Sciences (NSF-TUES) grant (NSF #1323258) and support from the Howard Hughes Medical Institute Professor Grant to Jo Handelsman.

    REFERENCES

  • Abrami, P. C., Poulsen, C., & Chambers, B. (2004). Teacher motivation to implement an educational innovation: Factors differentiating users and non-users of cooperative learning. Educational Psychology, 24(2), 201–216. doi: 10.1080/0144341032000160146 Google Scholar
  • Ajzen, I. (1991). The theory of planned behavior. Organizational Behavior and Human Decision Processes, 50(2), 179–211. Google Scholar
  • American Association for the Advancement of Science. (2011). Vision and change: A call to action. Final report. Washington, DC. Google Scholar
  • Aragón, O. R., Dovidio, J. F., & Graham, M. J. (2017). Colorblind and multicultural ideologies are associated with faculty adoption of inclusive teaching practices. Journal of Diversity in Higher Education, 10(3), 201–215. Google Scholar
  • Aragón, O. R., & Graham, M. J. (2015). Census survey report for the Summer Institutes on scientific teaching. Washington, DC: National Academies of Science. Google Scholar
  • Baepler, P., Walker, J. D., & Driessen, M. (2014). It’s not about seat time: Blending, flipping, and efficiency in active learning classrooms. Computers & Education, 78, 227–236. Google Scholar
  • Bandura, A. (1977). Self-efficacy: Toward a unifying theory of behavioral change. Psychological Review, 84(2), 191–215. MedlineGoogle Scholar
  • Bandura, A. (1991). Social cognitive theory of self-regulation. Organizational Behavior and Human Decision Processes, 50(2), 248–287. Google Scholar
  • Blackwell, L. S., Trzesniewski, K. H., & Dweck, C. S. (2007). Implicit theories of intelligence predict achievement across an adolescent transition: A longitudinal study and an intervention. Child Development, 78, 246–263. MedlineGoogle Scholar
  • Bradforth, S. E., Miller, E. R., Dichtel, W. R., Leibovich, A. K., Feig, A. L., Martin, D., … Smith, T. L. (2015). Improve undergraduate science education: It is time to use evidence-based teaching practices at all levels by providing incentives and effective evaluations. Nature, 523(7560), 282–285. MedlineGoogle Scholar
  • Burnette, J. L., O’Boyle, E., VanEpps, E. M., Pollack, J. M., & Finkel, E. J. (2013). Mindsets matter: A meta-analytic review of implicit theories and self-regulation. Psychological Bulletin, 139, 655–701. doi: 10.1037/a0029531 MedlineGoogle Scholar
  • Caniëls, M. C. J., Semeijn, J. H., & Renders, I. H. M. (2018). Mind the mindset! The interaction of proactive personality, transformational leadership and growth mindset for engagement at work. Career Development International. https://doi.org/10.1108/CDI-11-2016-0194 Google Scholar
  • Cavanagh, A. J., Aragón, O. R., Chen, X., Couch, B., Durham, M., Bobrownicki, A., … Graham, M. J. (2016). Student buy-in to active learning in a college science course. CBE—Life Sciences Education, 15(4), ar76. LinkGoogle Scholar
  • Cavanagh, A. J., Chen, X., Bathgate, M., Frederick, J., Hanauer, D. I., & Graham, M. J. (2018). Trust, growth mindset, and student commitment to active learning in a college science course. CBE—Life Sciences Education, 17(1), ar10. LinkGoogle Scholar
  • Clarke, J. S. (1996). Faculty receptivity/resistance to change, personal and organizational efficacy, decision deprivation and effectiveness in research I universities. In ASHE Annual Meeting Paper held November 3, 1996, in Memphis, TN. Google Scholar
  • Couch, B. A., Brown, T. L., Schelpat, T. J., Graham, M. J., & Knight, J. K. (2015). Scientific teaching: Defining a taxonomy of observable practices. CBE—Life Sciences Education, 14(1), ar9. LinkGoogle Scholar
  • Dweck, C. S. (1986). Motivational processes affecting learning. American Psychologist, 41(10), 1040–1048. Google Scholar
  • Dweck, C. S. (1999). Self-theories: Their role in motivation, personality and development. Philadelphia: Taylor and Francis/Psychology Press. Google Scholar
  • Dweck, C. S. (2012). Mindsets and human nature: Promoting change in the Middle East, the schoolyard, the racial divide, and willpower. American Psychologist, 67(8), 614–622. MedlineGoogle Scholar
  • Dweck, C. S., Chiu, C. Y., & Hong, Y. Y. (1995). Implicit theories and their role in judgments and reactions: A word from two perspectives. Psychological Inquiry, 6(4), 267–285. Google Scholar
  • Eagly, A. H., Chen, S., Chaiken, S., & Shaw-Barnes, K. (1999). The impact of attitudes on memory: An affair to remember. Psychological Bulletin, 125(1), 64–89. MedlineGoogle Scholar
  • Ebert-May, D., Derting, T. L., Hodder, J., Momsen, J. L., Long, T. M., & Jardeleza, S. E. (2011). What we say is not what we do: Effective evaluation of faculty professional development programs. BioScience, 61(7), 550–558. Google Scholar
  • Fishbein, M., & Ajzen, I. (2011). Predicting and changing behavior: The reasoned action approach. New York: Psychology Press, Taylor and Francis Group. Google Scholar
  • Gollwitzer, P. M. (1990). Action phases and mind-sets. In Higgins, E. T., & Sorrentino, R. M. (Eds.), Handbook of motivation and cognition: Foundations of social behavior (Vol. 2, pp. 53–92). New York: Guilford. Google Scholar
  • Gollwitzer, P. M. (1993). Goal achievement: The role of intentions. European Review of Social Psychology, 4(1), 141–185. Google Scholar
  • Gollwitzer, P. M., & Sheeran, P. (2006). Implementation intentions and goal achievement: A meta-analysis of effects and processes. Advances in Experimental Social Psychology, 38, 69–119. Google Scholar
  • Good, C., Aronson, J., & Inzlicht, M. (2003). Improving adolescents’ standardized test performance: An intervention to reduce the effects of stereotype threat. Journal of Applied Developmental Psychology, 24(6), 645–662. Google Scholar
  • Gorozidis, G., & Papaioannou, A. G. (2014). Teachers’ motivation to participate in training and to implement innovations. Teaching and Teacher Education, 39, 1–11. doi: 10.1016/j.tate.2013.12.001 Google Scholar
  • Guskey, T. F. (1985). The effects of staff development on teachers’ perceptions about effective teaching. Journal of Education Research, 78, 378–381. Google Scholar
  • Handelsman, J., Miller, S., & Pfund, C. (2007). Scientific teaching. New York: Macmillan. Google Scholar
  • Heckhausen, J., & Schulz, R. (1995). A life-span theory of control. Psychological Review, 102(2), 284. MedlineGoogle Scholar
  • Heine, S. J., Kitayama, S., Lehman, D. R., Takata, T., Ide, E., Leung, C., & Matsumoto, H. (2001). Divergent motivational consequences of success and failure in Japan and North America. Journal of Personality and Social Psychology, 81, 599–615. MedlineGoogle Scholar
  • Henderson, C., Beach, A., & Finkelstein, N. (2011). Facilitating change in undergraduate STEM instructional practices: An analytic review of the literature. Journal of Research in Science Teaching, 48(8), 952–984. doi: 10.1002/tea.20439 Google Scholar
  • Herscovitch, L., & Meyer, J. P. (2002). Commitment to organizational change: Extension of a three-component model. Journal of Applied Psychology, 87(3), 474. MedlineGoogle Scholar
  • Hu, L. T., & Bentler, P. M. (1999). Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Structural Equation Modeling: A Multidisciplinary Journal, 6(1), 1–55. Google Scholar
  • Hulleman, C. S., & Harackiewicz, J. M. (2009). Promoting interest and performance in high school science classes. Science, 326(5958), 1410–1412. MedlineGoogle Scholar
  • Kagan, D. M. (1992). Implication of research on teacher belief. Educational Psychologist, 27(1), 65–90. doi: 10.1207/s15326985ep2701_6 Google Scholar
  • Locke, E. A., & Latham, G. P. (1990). A theory of goal setting and task performance. Englewood Cliffs, NJ: Prentice-Hall. Google Scholar
  • Miller, C. J., & Metz, M. J. (2014). A comparison of professional-level faculty and student perceptions of active learning: Its current use, effectiveness, and barriers. Advances in Physiology Education, 38(3), 246–252. MedlineGoogle Scholar
  • Murphy, M. C., & Dweck, C. S. (2016). Mindsets shape consumer behavior. Journal of Consumer Psychology, 26(1), 127–136. Google Scholar
  • Paunesku, D., Walton, G. M., Romero, C., Smith, E. N., Yeager, D. S., & Dweck, C. S. (2015). Mind-set interventions are a scalable treatment for academic underachievement. Psychological Science, 26(6), 784–793. MedlineGoogle Scholar
  • Pfund, C., Miller, S., Brenner, K., Bruns, P., Chang, A., Ebert-May, D., … Handelsman, J. (2009). Summer Institute to improve university science teaching. Science, 324, 470–471. MedlineGoogle Scholar
  • President’s Council of Advisors on Science and Technology. (2012). Engage to excel: Producing one million additional college graduates with degrees in science, technology, engineering, and mathematics. Washington, DC: U.S. Government Office of Science and Technology. Google Scholar
  • Rattan, A., Good, C., & Dweck, C. S. (2012). “It’s ok—Not everyone can be good at math”: Instructors with an entity theory comfort (and demotivate) students. Journal of Experimental Social Psychology, 48(3), 731–737. Google Scholar
  • R Core Team. (2015). R: A language and environment for statistical computing. Vienna, Austria: R Foundation for Statistical Computing. Retrieved May 1, 2018, from www.R-project.org Google Scholar
  • Rogers, E. M. (2010). Diffusion of innovations. New York: Simon and Schuster. Google Scholar
  • Rogers, E. M., & Shoemaker, F. F. (1971). Communication of innovations: A cross-cultural approach. New York: The Free Press. Google Scholar
  • Rosseel, Y. (2012). lavaan: An R package for structural equation modeling. Journal of Statistical Software, 48(2), 1–36. Retrieved May 1, 2018, from www.jstatsoft.org/v48/i02 Google Scholar
  • Waugh, R. F., & Punch, K. F. (1987). Teacher receptivity to systemwide changes in the implementation stage. Review of Educational Research, 57(3), 237–254. Google Scholar
  • Wiggins, G. P., & McTighe, J. (2005). Understanding by design. Alexandria, VA: ASCD. Google Scholar
  • Wright, M. (2005). Always at odds? Congruence in faculty beliefs about teaching at a research university. Journal of Higher Education, 76(3), 331–353. Google Scholar
  • Yeager, D. S., Walton, G. M., Brady, S. T., Akcinar, E. N., Paunesku, D., Keane, L., … Gomez, E. M. (2016). Teaching a lay theory before college narrows achievement gaps at scale. Proceedings of the National Academy of Sciences USA, 113(24), E3341–E3348. MedlineGoogle Scholar