
Through the Eyes of Faculty: Using Personas as a Tool for Learner-Centered Professional Development

    Published Online: https://doi.org/10.1187/cbe.19-06-0114

    Abstract

    College science instructors need continuous professional development (PD) to meet the call to evidence-based practice. New PD efforts need to focus on the nuanced blend of factors that influence instructors’ teaching practices. We used persona methodology to describe the diversity among instructors who were participating in a long-term PD initiative. Persona methodology originates from ethnography. It takes data from product users and compiles those data in the form of fictional characters. Personas facilitate user-centered design. We identified four personas among our participants: Emma the Expert views herself as the subject-matter expert in the classroom and values her hard-earned excellence in lecturing. Ray the Relater relates to students and focuses on their points of view about innovative pedagogies. Carmen the Coach coaches her students by setting goals for them and helping them develop skill in scientific practices. Beth the Burdened owns the responsibility for her students’ learning and feels overwhelmed that students still struggle despite her use of evidence-based practice. Each persona needs unique PD. We suggest ways that PD facilitators can use our personas as a reflection tool to determine how to approach the learners in their PD. We also suggest further avenues of research on learner-centered PD.

    INTRODUCTION

    College science instructors need to shift their teaching toward evidence-based practice, and many cannot do so without support. Many high-profile reports from workforce leaders, policy makers, and scientists ask instructors to reconsider teaching and learning. These calls encompass four broad areas. First, instructors should use progressive pedagogies, such as active learning, guided inquiry, and others that align with our growing knowledge of how people learn, including cognitive and affective factors (National Research Council, 2000; President’s Council of Advisors on Science and Technology, 2012; Freeman et al., 2014; Kapur, 2016; National Academies of Sciences, Engineering, and Medicine, 2018). Second, instructors should focus on core concepts, privileging depth over breadth and letting go of the struggle to cover all the material. To help with this transition, the Vision and Change report defined five core concepts that undergraduate biology majors should learn (American Association for the Advancement of Science [AAAS], 2011), which have been translated into learning objectives for a variety of life science courses (e.g., Brownell et al., 2014). Third, instructors should teach scientific practices, not just content. Students need to learn to apply the process of science; use models, simulations, and quantitative reasoning; and communicate their understanding in collaborative teams and to the public (AAAS, 2011). If courses focus on the acquisition of knowledge without opportunities to practice using that knowledge, students will leave college ill-equipped for science, technology, engineering, and mathematics careers or everyday decision making pertaining to science. Fourth, instructors should create inclusive classrooms that allow students from highly diverse backgrounds and experiences to engage in science (Haak et al., 2011; Estrada et al., 2016). These four calls illustrate that the days of the sage on the stage must end. It is no longer acceptable for college instructors to rely entirely on expert knowledge of the discipline. College teaching has become complex and demanding, and instructors need ongoing support to address these demands.

    In response, educational leaders have created a variety of professional development (PD) programs. At the national level, the Summer Institutes for Scientific Teaching brings together life science instructors for a weeklong workshop to design teachable units that incorporate active learning, assessments, and inclusive teaching approaches (Pfund et al., 2009). Similarly, the Cottrell Scholars Collaborative New Faculty Workshop engages chemistry instructors in the creation of instructional materials that employ evidence-based approaches (Baker et al., 2014). In the geosciences, the Cutting Edge Workshop invites instructors from multiple colleges to discuss how to improve geoscience teaching and provides an online platform for sharing instructional materials (Manduca et al., 2017). At a local level, many universities offer faculty learning communities (FLCs) within or across science departments. In FLCs, instructors define a teaching-related topic and pursue it collaboratively, generating a product at the end of their work, such as lessons for use in a common course (Cox, 2001, 2004; Elliott et al., 2016). Related initiatives engage groups of instructors in new approaches to assessment (McCourt et al., 2017), lesson design (Pelletreau et al., 2018), or curricula focused on primary literature (Stevens and Hoskins, 2014). These programs have made important contributions by raising the awareness and practice of evidence-based teaching for thousands of college science instructors (Beach and Cox, 2009; Ebert-May et al., 2011; Stains et al., 2015; Manduca et al., 2017). Yet there is a need for another step forward in PD. Evidence suggests many instructors who have participated in PD eventually abandon evidence-based practice (Henderson et al., 2012) or implement it in ways that do not promote student learning (Andrews et al., 2011). PD should go beyond guidance in pedagogical procedures and the exchange of ideas about teaching (Henderson et al., 2011; McCourt et al., 2017).

    Prior research shows that helping instructors modify their teaching involves a complex network of factors. One way to think about how classroom practice interacts with instructor thinking as well as the teaching context is the teacher-centered systemic reform model (TCSR; Gess-Newsome et al., 2003). In the TCSR, instructor thinking encompasses internal, individual thinking and includes knowledge, values, and related constructs (Gess-Newsome et al., 2003). Auerbach and colleagues illustrated the importance of instructor thinking (Auerbach et al., 2018). They showed that expert active-learning college instructors were more likely than novices to display both knowledge of students’ conceptual difficulties and strategies for holding students accountable for in-class work, monitoring and responding to student thinking during class, and creating opportunities for generative cognitive engagement (Auerbach et al., 2018). This work, along with many other studies, suggests the thinking of college instructors needs to develop in broad ways that go beyond content knowledge (Auerbach et al., 2018) and superficial how-to knowledge of evidence-based teaching (Park et al., 2011; Sadler et al., 2013; Stains and Vickrey, 2017).

    Teaching context, which is another important aspect of the TCSR model, also influences the practice of college science instructors (Gess-Newsome et al., 2003). Instructors who are on board to develop their thinking come up against situational barriers at the classroom, departmental, and institutional levels (Henderson and Dancy, 2007, 2011; Kezar, 2014; Bouwma-Gearhart et al., 2016). Many college instructors struggle with scaling up evidence-based practices for large-enrollment courses, and instructors perceive pressure from their colleagues and disciplines to move quickly through curricula because of content coverage expectations (Henderson and Dancy, 2007; Andrews and Lemons, 2015). Instructors often encounter dips in student evaluations during their transition to evidence-based practice (Allen et al., 2001; Seidel and Tanner, 2013), and the teaching evaluation systems of departments and institutions are not well equipped to tolerate these periods of adjustment (Hornstein, 2017). Finally, few institutions have implemented policies and practices that sufficiently incentivize, recognize, and reward evidence-based practice (Corbo et al., 2016; Reinholz et al., 2017).

    Instructor thinking and teaching context interact in nuanced ways. Instructors who participate in PD bring with them a complex blend of thoughts and contextual issues (Ferrare and Hora, 2014; Hora, 2014; Lund and Stains, 2015; Auerbach and Andrews, 2018). It will not suffice to simply categorize college science instructors as instructor centered or student centered (Prosser et al., 1994; Hora, 2014; Smith et al., 2014), nor can we assume that all science instructors who practice traditional forms of teaching are doing so because they think it is the best pedagogy (Hora, 2014). A limited number of studies have characterized these complexities. For example, Hora identified 15 categories of instructor- or student-centered beliefs about student learning among a sample of science and math instructors (Hora, 2014) and found that most instructors held both instructor- and student-centered beliefs. Ferrare and Hora (2014) further showed that instructors’ enactment of beliefs about learning can be supported or constrained by their instructional contexts. Similarly, Henderson and Dancy (2007, 2011) found that situational barriers can lead to discrepancies between instructor thinking and practice. This important work suggests there is likely to be situational diversity among PD participants. PD designers need actionable ways to lead diverse participants.

    Persona methodology offers an ideal tool for PD designers. Personas are fictional characters that represent key characteristics from a user population of a specific product, and those personas are based on data from real users of the product (Pruitt and Adlin, 2006). Cooper created and introduced persona methodology to the product design industry to overcome the problem of designs that worked optimally for designers but were ill suited for the user base (Cooper, 1999; Pruitt and Adlin, 2006). Persona methodology draws from ethnography to describe how consumers use products in everyday contexts. These rich descriptions enable product design that meets users’ operational needs and individual preferences. Persona methodology also combats a second issue in the product design field: user reports that are informative but too lengthy, detailed, and cumbersome (Pruitt and Adlin, 2006). In contrast, personas communicate user information in robust, compact, easily digestible ways that are engaging. Personas also increase the memorability of data, because they provide characters, similar to the characters in a story (Denning, 2002; Grudin and Pruitt, 2002; Pruitt and Adlin, 2006).

    We see the persona methodology as a powerful tool for college science PD for several reasons. First, personas could capture the key discriminating features found across a large number of instructors. Second, personas could be concrete tools that change agents can refer to during PD planning and implementation. Third, by virtue of communicating information through a fictional human character, personas could evoke empathy in PD providers, a critical element if we intend to create a shared vision of education (Henderson and Dancy, 2011).

    Only a few studies have applied persona methodology to college instructors (Avgerinou and Andersson, 2007; Finelli et al., 2014; Madsen et al., 2014; Guy, 2017), and only two of these studies connect personas with PD. Specifically, Avgerinou and Andersson (2007) created personas of those aspiring to be online instructors as informational tools for other instructors and PD designers. In another study, Madsen and colleagues created five personas of physics instructors to inform the design of their online PD resources (Madsen et al., 2014). Specifically, they created personas based on instructors’ assessment needs and directly linked their personas to intentional features built into a website to support use of assessment innovations among university physics instructors (www.perusersguide.org). Persona methodology could be further applied to inform the design of PD for college science teaching across a variety of teaching and learning goals. In the study reported here, we used persona methodology to characterize the complex blend of instructor thinking, practice, and context across a sample of college biology instructors who were participating in long-term PD focused on evidence-based assessment practices (Haudek et al., 2011; McCourt et al., 2017). We created personas in order to understand how best to tailor future PD interventions. We asked the following research questions:

    1. What are the personas that exist in a PD community of biology instructors?

    2. What are the similarities and differences in desired PD outcomes among personas?

    We constructed four personas to capture and communicate the distinctive ways instructors think about teaching and the situational barriers they encounter when implementing evidence-based practices. Our personas provide an evidence-based, narrative description of the robust differences across instructors to help inform PD designers of their user base and how best to meet the needs of diverse instructors.

    METHODS

    Participants and Context

    This study involved 19 biology instructors from six R1 universities in the United States. The demographic data on participants’ gender, faculty position, and years of teaching experience are listed in Table 1. We did not collect data on participants’ race and ethnicity, and therefore cannot report them here. Participants’ teaching experience ranged from 6 to 31 years, with an average of 16 years. The number of participants at each university ranged from two to five. Participants were part of a 5-year national education initiative providing PD for users of Automated Analysis of Constructed Response (AACR) assessments (www.msu.edu/~aacr). The initiative began in January 2014. AACR develops constructed-response assessments in topics across biology. Student written responses undergo computer-automated analysis to generate reports within minutes on the categories of student ideas present (Ha et al., 2011; Haudek et al., 2011, 2012; Urban-Lurain et al., 2013; Moharreri et al., 2014; Weston et al., 2015; Prevost et al., 2016). The AACR library contains more than 100 questions that instructors can select from and administer as a timely formative assessment alternative to multiple-choice questions.

    TABLE 1. Participant demographic information^a

    Position/title                  Gender    Teaching experience (years)
    Department chair, professor     Male      31
    Professor attendant             Female    31
    Department head, professor      Male      24
    Associate professor             Female    20
    Associate professor             Male      20
    Professor                       Female    20
    Senior instructor               Female    18
    Associate professor             Female    17
    Associate professor             Female    16
    Associate professor             Male      15
    Lecturer                        Female    15
    Lecturer                        Male      14
    Associate professor             Male      13
    Visiting assistant professor    Male      12
    Assistant professor             Male      10
    Academic specialist             Female    8
    Instructor                      Female    8
    Assistant professor             Male      7
    Instructor                      Male      6

    ^a Data on race and ethnicity were not collected.

    Participants were recruited at their local institutions to participate in the 5-year initiative, which included attending local AACR group meetings facilitated by the local principal investigators of the AACR project. AACR groups met approximately three times per semester. The AACR group meetings focused on how to use AACR questions and interpret AACR reports on student responses. The meetings also covered many other topics, such as discussions about course curricula, course sequences, student thinking and behaviors in class, frustrations with teaching, and professional roles and responsibilities as an instructor. The direction of topic discussions was driven by the facilitator and meeting attendees.

    The instructors in this study comprise only one sample from the college biology instructor population across the United States, and our analyses, which are described below, are not intended to produce broadly generalizable results. Rather, they are intended to characterize the instructor thinking of this particular sample.

    Data Collection

    In Spring 2014 and 2015, we conducted semistructured interviews with all participants, except two who were unavailable for interviews in 2015. We designed interviews to target instructors’ thinking on teaching and learning and their motivation to persist in PD. The interview script and results regarding AACR instructors’ motivation to persist in PD are reported by McCourt and colleagues (2017). We also collected at least two Classroom Observation Protocol for Undergraduate STEM (COPUS) classroom observations for participants every semester that they taught from Spring 2014 through Fall 2015 (Smith et al., 2013). The COPUS was adapted from the Teaching Dimensions Observation Protocol (Hora et al., 2013; Hora and Ferrare, 2014). We collected a total of 89 COPUS observations, ranging from two to 10 observations per participant. All research activities were approved by the University of Georgia Institutional Review Board (IRB protocol 00000257).

    Data Analysis

    Persona Construction.

    We used persona methodology to make meaning of our data. To create personas, we followed Pruitt and Adlin’s (2006) six-step process. Below, we explain how we applied each of their steps to our study. Within the description of these steps, we refer to detailed procedures that we describe in later sections of the Methods. It should be noted that instructor demographic data were not used in the development of the personas. Personas reflect only the instructors’ thinking and context found within the particular sample we studied. There are likely many other instructor personas in the U.S. population of college biology instructors.

    Step 1. Discuss categories of users involves determining what user information is important to examine for product design (Pruitt and Adlin, 2006). Here, the product we are interested in is PD, and the users are the individuals attending PD, who in this case are instructors. Operating under the framework of situated learning (Lave and Wenger, 1991; Kelly, 2006), we asked what information about instructors would best aid PD leaders in designing PD programs. In biology education, PD leaders often include 1) biology instructors who lead their colleagues, 2) postdocs who are hired to lead departmental PD, or 3) PD professionals who work at a center for teaching and learning (CTL). If PD leaders know what instructors value or the extent to which they invest in innovative teaching, the leaders may be able to plan and implement impactful PD experiences.

    Step 2. Process data involves extracting information, themes, and relationships from user data (Pruitt and Adlin, 2006). We analyzed our interview data following standard qualitative coding procedures. Through initial coding (Strauss and Corbin, 1998), we developed themes from the interview transcripts that capture how instructors think about teaching (Table 2). Coding is described in further detail under Coding.

    TABLE 2. Themes and theme descriptions used to characterize personas

    Theme                        Theme description: When instructors expressed …
    Knowledge of students        What they know and aim to know about their students, including conceptions, tendencies, habits, or backgrounds
    Teaching values              What is important to them as teachers and what they want students to get out of the classroom and college experience
    Approaches to innovations    An attitude or opinion regarding AACR questions and other innovative teaching practices
    Perceived barriers^a         Knowledge about local, departmental, or institutional norms and customs that they view as barriers
    Desired PD outcomes          What they want to know to facilitate their teaching

    ^a Perceived barriers and desired PD outcomes were not included in the cluster analysis (see Cluster Analysis section) used to determine the membership of personas but were used to help further characterize personas afterward.

    After transcripts were coded, we invited each of the six AACR PD facilitators to review the interpretations of their attendees that we had gathered from the coding analysis. This review allowed us to leverage the knowledge of the facilitators, which was grounded in their experience with participants, to validate our claims about the instructors, prioritize dominant claims, and gain new insights. For the facilitator review process, we provided facilitators with individual coding summaries, discussed under Coding Summaries.

    To determine how best to sort individuals into groups, we performed a cluster analysis described in Cluster Analysis. We selected a five-cluster pattern (Figure 1, cluster results) and began to build persona skeletons as part of step 3.

    FIGURE 1.

    FIGURE 1. Dendrogram used for persona creation. We selected this dendrogram, generated with the average linkage (between groups) method, from among the dendrograms produced in cluster analysis. Cluster analysis and dendrogram selection criteria are described in Methods: Cluster Analysis. The vertical dotted line represents the cutoff for having five clusters, labeled by the numbered boxes.

    Step 3. Identify and create skeletons (Pruitt and Adlin, 2006). Skeletons contain relevant data and descriptions that are used to further develop the persona character (Pruitt and Adlin, 2006). We created five skeletons that corresponded to the five-cluster pattern obtained in step 2 (Figure 1). We documented underlying themes for each skeleton.

    Step 4. Evaluate and prioritize skeletons involves deciding which skeletons will be developed into personas (Pruitt and Adlin, 2006). We decided to eliminate the one skeleton that consisted of a single participant in the five-cluster grouping (Figure 1, cluster results). The data and descriptions of this participant were not distinctive enough to warrant a separate persona, whereas the remaining four skeletons each contained unique data. These four skeletons were developed into personas in step 5.

    Step 5. Develop skeletons into personas (Pruitt and Adlin, 2006). We used coding summaries to identify the key traits for each theme that emerged (Table 2). For example, for the persona that later became Ray the Relater, a common theme among the three individuals making up the persona was that they enjoy relating to their students. Once key traits were defined across all five themes, the most defining trait of each persona was used to create the persona’s name. We followed Pruitt and Adlin’s (2006) recommendation to employ alliteration for persona names to increase memorability. Persona methodology typically employs pictures as well. However, persona pictures have been shown to trigger biases and preconceptions in those who use them (Salminen et al., 2019). We therefore do not include pictures of our personas, to avoid introducing potential reader biases. We randomly assigned the gender of each persona using a free online program that randomly generated either 0 or 1, corresponding to female or male. Therefore, our persona descriptions have no connection to the persona’s gender. Each persona included instructors of both genders. Finally, representative quotes were selected from individuals within each persona to illustrate the key characteristics we found across personas.

    Step 6. Validate personas involves examining whether personas still reflect the real data after undergoing steps 1–5 (Pruitt and Adlin, 2006). To address validation, we obtained feedback from a group of approximately 15 biology education researchers that included faculty, postdocs, graduate students, and undergraduate research assistants. The group generally agreed that the personas aligned well with the data from interviews.

    Qualitative Analysis

    Coding.

    Interview transcripts were analyzed in MAXQDA v. 12. Multiple rounds of initial coding (Charmaz, 2006) were conducted on all 36 interview transcripts to construct a tentative list of codes. Codes centered on five major themes regarding instructors’ 1) knowledge of students, 2) teaching values, 3) approaches to innovations, 4) perceived barriers, and 5) desired PD outcomes (Table 2). Three researchers (P.Z., R.I., P.L.) independently coded eight of the 36 transcripts (22%) and met to discuss their coding. They discussed disagreements until they reached agreement on the code to assign. These discussions led to mutual understanding of codes, greater precision in code definitions, merging of codes that overlapped in meaning, and removal of codes that provided little explanatory power. This reorganization resulted in 50 total codes (Supplemental Figure 1). Next, one coder (P.Z.) independently coded another eight transcripts, and the other two coders (R.I. and P.L.) checked the coding. The three researchers resolved disagreements through discussion. The same coder (P.Z.) then independently coded the remaining 20 transcripts. Single codes could be applied multiple times within the same transcript, and multiple codes could be applied to the same segment within a transcript.

    Coding Summaries.

    Once all interviews were coded, coding summaries were generated for each participant to better understand the instructors and draw conclusions about their traits. Coding summaries were constructed by reviewing coded segments and making claims about each participant with supporting evidence in the form of quotes. Claims were organized based on themes from Table 2. The legitimacy of the coding summaries was checked through one-on-one discussions with AACR PD facilitators. Facilitators were asked if they agreed with the claims and for additional relevant insights on their experiences with their meeting attendees. This feedback informed persona creation.

    Cluster Analysis.

    Persona creation requires grouping individuals based on patterns in qualitative data. To explore possible groupings among AACR instructors, we performed cluster analysis. Cluster analysis is an exploratory technique that attempts to discover the underlying structure within a data set by grouping similar components of the data. Our analysis clustered individual instructors based on the code frequencies we assigned to their interview transcripts. Instructor interview lengths had a mean, median, and mode of 1.8 hours and an SD of 0.469 hours (n = 19 instructors). Because one participant’s interview lasted more than 3 hours, we normalized all code frequencies by the length of the respective interview. Several clustering methods exist, differing in their distance metrics and linkage rules. We used hierarchical agglomerative cluster analysis with chi-square or squared Euclidean distance values and complete or average linkage (DataCamp, Inc., n.d.; Wilks, 2011). All analyses were performed in IBM SPSS v. 24.
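    The study ran these analyses in SPSS; as a rough illustration only, the following Python sketch reproduces the same pipeline on fabricated data. The code-frequency matrix, interview lengths, and random seed are all hypothetical, and squared Euclidean distance stands in for the chi-square distance used in the selected solution, because SciPy has no built-in chi-square metric.

```python
# Minimal sketch of the clustering pipeline (illustrative data only).
import numpy as np
from scipy.cluster.hierarchy import linkage
from scipy.spatial.distance import pdist

rng = np.random.default_rng(0)
code_counts = rng.poisson(lam=3, size=(19, 19))              # instructors x codes (hypothetical)
interview_hours = rng.normal(1.8, 0.469, size=19).clip(0.5)  # interview lengths in hours (hypothetical)

# Normalize raw code frequencies by interview length, as described above
rates = code_counts / interview_hours[:, None]

# Hierarchical agglomerative clustering: squared Euclidean distance, average linkage
tree = linkage(pdist(rates, metric="sqeuclidean"), method="average")
```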

    For cluster analysis, the number of codes should not be greater than the number of individuals being clustered (Aldenderfer and Blashfield, 1984). Therefore, we narrowed the number of codes for cluster analysis. First, we considered only codes from three themes: knowledge of students, teaching values, and approaches to innovations (Table 2), because we were interested in grouping instructors based on their thinking about teaching. These three themes contained 34 codes. Next, we eliminated six codes in which multiple, distinct ideas were grouped under a single code and would have required further subcode analysis. For example, one of these codes was “Like or dislike AACR because it targets misconceptions” (Supplemental Figure 1), which contained reasons instructors cited for liking AACR because it identified misconceptions in their students, as well as reasons for disliking AACR because it targeted misconceptions the instructor already knew about, such that no new knowledge was gained. Removing these six codes prevented ambiguity about which ideas within a code were clustering with the other, stand-alone codes. Finally, we determined which codes were most common across instructors (in order to represent multiple instructors) and varied the most in prevalence (in order to maximally capture distinctions within the data set for clustering). Doing so allowed us to eliminate nine codes that were infrequent or low in variability. We used the remaining 19 codes for cluster analysis, marked in Supplemental Figure 1.
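    As a hedged sketch of this winnowing step, the function below keeps only codes that appear across many instructors and vary in prevalence; the thresholds are illustrative placeholders, not the cutoffs used in the study.

```python
# Sketch of code selection: drop rare or low-variability codes (thresholds hypothetical).
import numpy as np

def select_codes(counts: np.ndarray, code_names: list[str],
                 min_prevalence: float = 0.5, min_variance: float = 1.0) -> list[str]:
    """counts: instructors x codes matrix of code frequencies."""
    prevalence = (counts > 0).mean(axis=0)  # fraction of instructors exhibiting each code
    variance = counts.var(axis=0)           # spread of each code across instructors
    keep = (prevalence >= min_prevalence) & (variance >= min_variance)
    return [name for name, kept in zip(code_names, keep) if kept]
```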

    Because cluster analysis is an exploratory technique, we performed multiple cluster analyses with different distance and linkage algorithms. Each of these methods leads to a dendrogram that illustrates possible hierarchical relationships within the data (Arabie et al., 1996). To select the cluster method and accompanying dendrogram we would use for persona creation, we compared the dendrograms with one another and with our qualitative data. We determined which individuals clustered together across multiple dendrograms and excluded dendrograms with uncommon clustering patterns. We evaluated the remaining dendrograms in light of the qualitative analysis, asking ourselves which clustering method generated a dendrogram that captured the groupings we saw in the qualitative results. We selected the dendrogram shown in Figure 1, which was generated using chi-square distance and average linkage. From this dendrogram, we used a grouping set that created five clusters (see the dotted line in Figure 1, with labeled squares marking the five clusters), because five personas is the maximum number recommended by Pruitt and Adlin (2006). The five clusters include one group of seven who became the persona Emma, one group of five who became Carmen, two groups of three who became Ray and Beth, and a single individual whom we ultimately excluded from further analysis because the individual did not show characteristics that were sufficiently distinct from the other personas.
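    A sketch of this comparison step, under the same assumptions as above: generate five-cluster solutions for several distance/linkage combinations and inspect which instructors co-cluster across them. The two metrics shown are stand-ins, since SciPy lacks a built-in chi-square distance.

```python
# Compare five-cluster memberships across clustering methods (illustrative).
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

def five_cluster_solutions(rates):
    """rates: instructors x codes matrix of length-normalized code frequencies."""
    solutions = {}
    for metric in ("sqeuclidean", "euclidean"):   # stand-ins for the metrics compared
        for method in ("average", "complete"):
            tree = linkage(pdist(rates, metric=metric), method=method)
            # Cut each dendrogram into five clusters, the maximum number of
            # personas recommended by Pruitt and Adlin (2006)
            solutions[(metric, method)] = fcluster(tree, t=5, criterion="maxclust")
    return solutions
```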

    Many biologists recognize dendrograms, which are also used in phylogenetic systematics to represent evolutionary relationships among taxa. The dendrogram we present should be read like one used in phylogenetics; that is, the hierarchical relationships within the data should be interpreted by looking at the branch points (e.g., Novick and Catley, 2007). However, phylogenetic analysis often uses a statistical technique known as bootstrapping (Felsenstein, 1985), which generates multiple solutions by repeatedly resampling from the original data to establish a confidence measure. Bootstrapping depends on a large data set from which repeated samples can be drawn. Because we had only 19 participants, we could not conduct bootstrapping with our data. Rather, we compared multiple dendrograms generated by different distance and linkage algorithms to identify common cluster relationships among participants. From there, we selected the cluster solution that was most consistent with our qualitative analysis, but it is not possible to assign a confidence measure to the likelihood that it is the best solution.

    COPUS Analysis.

    We input all classroom observations (n = 86) collected from Spring 2014 through Fall 2015 for all participating AACR instructors, excluding the participant we eliminated during cluster analysis, into the COPUS analyzer (www.copusprofiles.org; Stains et al., 2018). We used the provided “minute-by-minute template” for multiple instructors with multiple observations. The COPUS analyzer categorizes each classroom observation into one of seven COPUS clusters, which can be collapsed into one of three COPUS clusters: didactic, interactive lecture, or student-centered (Stains et al., 2018). Classrooms in the didactic category spend a high majority of class time (e.g., 80% or more) on lecturing. Classrooms in the interactive lecture category contain moderate levels of lecturing supplemented with student-centered interaction techniques, such as clicker-question group work, and classrooms in the student-centered category contain moderate to low levels of lecturing with distinctly large chunks of student-centered activities, such as group worksheets (Stains et al., 2018). We used the latter three-cluster categorization. COPUS data were not used in the creation of personas, because we wanted personas to be based on participants’ thinking about teaching and their teaching contexts. Rather, we determined the frequency of each of the three clusters for all individuals in each persona and described the resulting average COPUS cluster distribution for each persona. Because average COPUS profiles were used to describe each persona’s classroom practice, not all instructors making up a persona exhibited the same COPUS profile distribution. We include the COPUS profiles for each instructor in Supplemental Figure 2.
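    To illustrate the aggregation described above, this sketch tallies three-cluster COPUS labels into a persona-level profile. The labels are assumed to come from the COPUS analyzer; the example counts are hypothetical but chosen to reproduce Emma’s reported profile after rounding (22 of 27 periods didactic ≈ 81%).

```python
# Tally three-cluster COPUS labels into a persona-level profile.
from collections import Counter

def persona_copus_profile(labels: list[str]) -> dict[str, float]:
    """labels: COPUS categories for all observed class periods of a persona's instructors."""
    counts = Counter(labels)
    total = sum(counts.values())
    return {category: counts[category] / total
            for category in ("didactic", "interactive lecture", "student-centered")}

# Hypothetical example approximating Emma's 27 observations
profile = persona_copus_profile(["didactic"] * 22 + ["interactive lecture"] * 5)
```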

    RESULTS

    Here we present the results for our two research questions:

    1. What are the distinct personas that exist in a PD community of biology instructors?

    2. What are the similarities and differences in desired PD outcomes among personas?

    We address research question 1 in two sections focused on 1) explaining the themes we used to characterize the personas and 2) presenting the four personas. We end with findings from research question 2 that articulate the PD outcomes sought by each distinct persona.

    Research Question 1. What Are the Distinct Personas That Exist in a PD Community of Biology Instructors?

    Themes Used to Characterize Personas.

    To characterize AACR instructors for persona creation, we took the perspective of a change agent who wants to design PD experiences that move instructors toward greater expertise with evidence-based teaching. Thus, we structured our data analysis to focus on five themes that reveal instructors’ readiness to use and sustain evidence-based teaching (Table 2).

    For the theme knowledge of students, we coded instances when instructors demonstrated their knowledge of students in cognitive and affective areas. In the cognitive area, knowledge included students’ mixed-model thinking, which instructors describe as misconceptions. In the affective area, knowledge included students’ backgrounds, frustrations, and tendencies. For example, instructors sometimes talked about students’ views on college, their behaviors in class, or their engagement with course material. In addition, instructors revealed what they found students like or need during class, such as providing interesting real-life examples and incentives to work. Sometimes instructors expressed unproductive student tendencies as deficits that the instructor must overcome, but in other cases, instructors noted the same unproductive student tendencies as a starting point for guiding students to better habits. Knowledge of students informs PD initiatives by revealing where instructors in our study already possessed extensive knowledge and where knowledge would benefit from further development.

    For the theme teaching values (Table 2), we documented instances when instructors stated what was important to them in teaching, such as communicating clearly, engaging students, connecting with students, promoting peer–peer interactions, or preparing students to be successful in their upper-level courses and professional careers. This code also captured instructors’ goals for their students, for example, the development of scientific thinking or problem-solving skills. The teaching values theme reveals potential forces that guided and filtered the thinking and practices of instructors in our study. Change agents can use these as potential levers in PD initiatives.

    For the theme approaches to innovations (Table 2), we captured the reasons instructors cited for using or not using various evidence-based teaching practices, including backward design, formative assessment, and active learning. For example, some instructors said they implement strategies for gathering evidence of student thinking during class but admitted they are not always clear about how to respond to the evidence they gather. This theme also includes instructors’ perceptions of education research.

    For the theme perceived barriers (Table 2), we captured instructors’ perceived barriers in the classroom, department, institution, or academy. Many of the barriers mentioned by our instructors have been well documented in the literature, such as lack of time, expectations of content coverage, and class size (e.g., Henderson and Dancy, 2011). However, we included barriers as a criterion in persona characterization to provide change agents with insight into the degree to which perceived barriers differed across the instructors in our study.

    Finally, for the theme desired PD outcomes (Table 2), we documented instructors’ perceptions of resources they needed for their courses. This included vetted activities, assessment items, strategies for improving student motivation and engagement levels, or consensus learning goals. These are discussed further under Research Question 2.

    We sought to cluster AACR instructors based on differences in the way they thought about teaching. We performed cluster analysis using the interview codes that revealed our AACR instructors’ knowledge of students, teaching values, and approaches to innovations. Using the most salient cluster solution (see Methods), we established four personas. We present our personas below through description, illustrative quotes, and COPUS profiles. Although it is customary to present these types of data in the past tense, we present our personas in the present tense to help readers connect with them as characters.

    The Four Personas: Emma, Ray, Carmen, and Beth.

    Figure 2 summarizes the four personas. Emma the Expert sees herself as the subject-matter expert in the classroom and deeply values her well-developed pedagogical expertise for lecturing. Ray the Relater likes to relate to his students and considers their points of view on instructional approaches in class. Carmen the Coach coaches her students by setting goals and guiding her students during class to successfully reach those goals. Beth the Burdened takes full responsibility for the burden of student learning, which weighs heavily on her, especially given that students still struggle despite her solid efforts to implement evidence-based practice. When COPUS profiles for all instructors are examined, the didactic teaching style is most common, followed by interactive lecture, then student-centered instruction (Supplemental Figure 3). However, when examining COPUS profiles by persona, Emma and Ray teach primarily in didactic ways, while Carmen and Beth teach primarily in interactive ways (Figure 2 and Supplemental Figure 3). Yet, despite similar classroom practices, Emma and Ray think differently about teaching, as do Carmen and Beth. We describe Emma, Ray, Carmen, and Beth in the following text and figures.

    FIGURE 2.

    FIGURE 2. Summaries of personas. A comparison of the different personas is shown with a representative quote from each. For COPUS, class periods were categorized using the three-cluster COPUS classification scheme (Stains et al., 2018). Didactic classes are shown in gray, interactive lecture in orange, and student-centered in purple.

    We depict key points and quotes pertaining to personas’ knowledge of students, teaching values, approaches to innovations, perceived barriers, and classroom practices in Figures 2–6. Figure 2 provides a summary comparison of personas, while Figures 3–6 present the actual personas and their defining characteristics. We use the text below to elaborate on the findings presented in these figures.

    Emma the Expert.

    Emma, a primarily didactic teacher, expects students to do their own learning and works hard to craft lectures that draw students into biology.

    For Emma’s knowledge of students, she believes students should learn on their own. She provides students with necessary tools to succeed on her exams by assigning “a couple of hundred questions” to use in their studying (Figure 3, quote 1) and making herself available during office hours. She “loves to have 5 or 6 [students] in her office for hours,” working with them on problems. Emma becomes frustrated when she discovers students perform poorly on her exams and did not use the available resources, including her, the subject-matter expert. She knows students tend to treat course work passively, which she considers to be “tragic,” so she constantly reminds her students there are more effective study strategies. She confesses that “nagging” and “yelling” at students are not the best tactics, but she lacks knowledge of another way to get students to improve their studying.

    FIGURE 3.

    FIGURE 3. Emma the Expert. This figure provides an overview of Emma’s knowledge of students, teaching values, approaches to innovations, and perceived barriers.

    Regarding teaching values, Emma values keeping students enthusiastic about course material and finds students gain interest in learning when the material grabs their attention (Figure 3, quote 2). Emma takes it upon herself to make the content come to life through clear and memorable explanations and examples. For Emma, teaching always goes “back to enthusiasm.” If students see her “getting really excited about something or making connections,” they become “even more interested in the subject.” She hears students complain that she gets off-track, but Emma does not mind the complaints, because her tangents motivate students to do their own learning. Emma also values that her students can synthesize and apply information, not just memorize facts. She constantly tells her students that exams will include “definitional” questions and “applying something or adjusting some sort of hypothetical situation where something in the process is broken.”

    Regarding approaches to innovations, Emma considers herself an expert in lecturing, a craft she honed over many years. She questions how well she can learn and implement other teaching approaches, hypothesizing it could take her years to successfully implement an evidence-based practice like flipping the classroom. She suspects her students may do worse during her learning curve (Figure 3, quote 3). Emma also believes evidence-based pedagogies work best for particular personality types. As an introvert, she feels the barriers to implementing evidence-based practices are higher for her than for others to whom teaching in evidence-based ways comes naturally (Figure 3, quote 4). Besides lack of experience and the right personality, Emma also shies away from evidence-based practices because of her knowledge of students’ tendencies. For instance, she bemoans students’ tendency to always want to memorize facts. She reasons that implementing in-class activities will lead to the same issue as lecturing, because students will still try to memorize the information from the activities (Figure 3, quote 5). Emma likes to use learning objectives and finds them useful, even if students do not meet them, because learning objectives make her instruction more organized (Figure 3, quote 6).

    Regarding perceived barriers, Emma describes four main barriers. First, academic culture deters Emma from putting more time into teaching. She acknowledges that she would learn a lot about her students’ thinking if she graded their exams but knows her department head would advise her not to grade (Figure 3, quote 7). Second, Emma sees her role in academia as maintaining the integrity of her discipline. This includes protecting the field’s high standards and simultaneously preparing potential scientists in her class for the challenging aspects of science (Figure 3, quote 8). Third, Emma feels a sense of duty to sort and rank students for their vocational fields, and this duty influences her use of instructional innovations. For example, we found Emma likes learning objectives, and she understands that, in theory, a student should earn an “A” if he or she meets all the learning objectives (Figure 3, quote 9). However, Emma points out that the educational system was traditionally set up for sorting students across a normal grade distribution. This system worked to inform postbaccalaureate institutions of students’ rankings relative to their peers. Thus, Emma views the traditional grading system as a barrier to fully using learning objectives in her class. Fourth, Emma struggles with the expectation of content coverage. For Emma, the content coverage pressure comes from her sense that upper-level courses depend on her, because her course is “a pre-req for upper-division courses,” and says that if you do not “cover something that you’re supposed to,” then students are not as prepared.

    When Emma’s class periods (n = 27) were observed using COPUS, 81% were didactic, 19% were interactive lecture, and none were student centered (Figure 2).

    Ray the Relater.

    Ray, who also primarily uses didactic teaching, loves to connect with his students and wants them to become life-long learners.

    Regarding knowledge of students, Ray demonstrates extensive knowledge of his students’ backgrounds, tendencies, and behaviors. We found Ray aims to relate to his students and thus understands their likes and needs. Unlike Emma, who primarily considers her lack of expertise with new teaching approaches, Ray takes students’ perspectives into account when considering the implementation of new pedagogies. For example, Ray considers students’ liking for lecture to be a problem for him to overcome (Figure 2, quote). Ray also mentions he would like to push his students more but worries about their comfort level (Figure 4, quote 1). Like Emma, Ray also expresses knowledge of students’ unproductive tendencies. For instance, Ray perceives his students come into his course wanting to be “spoon-fed” (Figure 4, quote 2), and he sees it as his duty to support his students’ transition toward taking ownership of their education. Therefore, we see that Ray, again like Emma, wants to help students overcome unproductive habits. However, while Emma takes the approach of reminding and telling students about productive ways to learn, Ray relates to his students so he can better help them learn course material and lasting educational values.

    FIGURE 4.

    FIGURE 4. Ray the Relater. This figure provides an overview of Ray’s knowledge of students, teaching values, approaches to innovations, and perceived barriers.

    Regarding teaching values, Ray wants to connect with his students as a person, not just a professor. He looks forward to meeting a new cohort of students every year and following their development as professionals (Figure 4, quote 3). Because Ray tunes himself to students’ perspectives, he responds when he senses a lull in students’ engagement levels during class. Ray primarily responds to disengagement by telling tangential stories that capture students’ attention, as we saw with Emma. While Emma uses this approach to fulfill her teaching value of getting students interested in the subject, Ray delights in the fact that his personal stories help to humanize him and make him more relatable to students (Figure 4, quote 4). Ray finds this strategy of going off-topic also helps students learn, because it provides them with a brain break from lectures that are too full of content (Figure 4, quote 4). Ray also values equipping his students with the skills and knowledge needed to be successful in the world (Figure 4, quote 5).

    Regarding Ray’s approaches to innovations, he reveals that he implements formative assessment for a variety of reasons. Ray likes that in-class assessments inform his teaching by telling him if he is communicating the material clearly and if he needs to stop and revisit a concept (Figure 4, quote 6). Ray also feels “much more engaged in the process of what’s going on” when he can “watch [students] learn during the semester” instead of waiting until the end of the semester to test his students and not “hav[ing] any real connection to what’s happening.” Ray also likes to implement formative assessment, because he finds his students like it, particularly because it helps them prepare for exams (Figure 4, quote 7). Ray likes active learning and sees it as another strategy—in addition to his tangential stories—that he can use to engage students and provide them with a mental break from lecture (Figure 4, quote 8). Finally, Ray finds plugging into education research to be “really interesting” because “it’s really important if there’s research that shows us something is effective.” Ray views education research as a means to learn best practices. Ray especially wonders how best to balance lecture, tangential stories, and activities to improve student learning (Figure 4, quote 9).

    Ray perceives two main barriers in his teaching: class size and an expectation of content coverage. Ray emphasizes how strategies like asking questions to gauge student learning break down in large classes (Figure 4, quote 10). Ray also recognizes that implementing in-class assessments and activities “eats some time in lecture.” He worries that “the more [he] use[s] clickers, the less time [he has] to cover material” and questions if he really has to “drop some material.”

    When Ray’s class periods (n = 19) were observed using COPUS, 69% were categorized as didactic, 26% as interactive lecture, and 5% as student-centered (Figure 2).

    Carmen the Coach.

    Carmen, who primarily uses interactive lecture, focuses her energy on creating tasks and an environment where students can practice science.

    For Carmen’s knowledge of students, she aims to deeply understand students’ thinking and conceptual struggles. In class, she searches for opportunities to see student thinking, walking around during class activities and talking to students to see where they are stuck. She wants to know not only what students struggle with, but also the general nature and organization of their knowledge (Figure 5, quote 1). Carmen mentions she can tap into student thinking during office hours as well, like Emma, but does not rely solely on office hours to learn how students think. Carmen knows a lot about students’ mixed-model thinking, but she “would lay money we’re missing a lot” too and knows there is much to learn about students’ nonnormative ideas. She knows students in her class do not fully understand every concept, and she worries that these conceptual issues will linger. She would like to learn better ways to tackle persistent student conceptual struggles (Figure 5, quote 2). Carmen also recognizes students’ negative tendencies. She feels particularly frustrated when students disengage during in-class activities, because she believes students learn by actively engaging in the process of learning (Figure 5, quote 3).

    FIGURE 5.

    FIGURE 5. Carmen the Coach. This figure provides an overview of Carmen’s knowledge of students, teaching values, approaches to innovations, and perceived barriers.

    Regarding teaching values, Carmen values students’ ability to complete problem-solving and application tasks. Carmen, like Emma, tests these skills on her exams. Unlike Emma, Carmen provides students with guided in-class practice that mimics the problem sets they will see on her exams. She uses different problem contexts on exams to test students’ ability to apply information in unfamiliar situations (Figure 5, quote 4). Carmen resembles Ray in the use of formative assessments to prepare students for the exam. Ray applies this approach because his students like having the practice, but Carmen uses formative assessment more systematically. Carmen values problem solving so much that she readily takes on the challenge of determining how to cut content in order to administer a new problem set (Figure 5, quote 5). To Carmen, a good problem-solving activity will “engage students in a meaningful way in thinking about core concepts while they use science practices” and focus on “creating and using a model or interpreting and analyzing data.” Carmen values eliciting student problem solving and scientific thinking during class because of the education research literature she reads. Carmen recalls data demonstrating that the brain activity of students watching TV was comparable to that of students listening to a lecture (Poh et al., 2010). She subsequently became terrified of student passivity during class (Figure 5, quote 6).

    Carmen, like Ray, values student engagement. When Ray notices disengaged students, he tells captivating, personal stories to watch students “perk up.” In contrast, Carmen battles disengagement by finding ways for all students to be interactive during class. Once, when she noticed that students sitting on the periphery of the room were “disconnected because of how far away” they were, she implemented a seat-rotation strategy “every couple of weeks” that brought students “on the fringes” right next to her. She finds “it’s much easier [for students] to be interactive if [they are] sitting there” right next to her.

    Regarding Carmen’s approaches to innovations, she likes and implements backward design. She considers how she can test students’ attainment of learning objectives and then determines what she can do in class to help students reach the objectives (Figure 2, quote). Unlike Emma, who uses learning objectives primarily to organize her lectures, Carmen uses learning objectives as a target for student learning. We found Carmen puts some of the burden of student learning on herself. When she discovers from her formative assessments that her students have a misconception, she immediately considers what she needs to do in response (Figure 5, quote 7). While Carmen regularly collects data on her students’ thinking through formative assessments, she often expresses that it is hard for her “to know how to use the data” and respond accordingly to “change what [she] did” in her teaching. She wonders whether it helps to “say the same thing just over again.” Finally, Carmen personally enjoys using student-centered pedagogies (Figure 5, quote 8). Like Ray, Carmen pays attention to how students feel about her interactive instructional approach. She finds that most “students like it, except for a small group of students that don’t like it, and they would prefer to get lectured to.” Unlike Ray, who focuses on students who are uncomfortable with innovative instruction, Carmen views these students as “special” outliers, and she subsequently carries forward with her interactive approach.

    Carmen perceives many of the common situational barriers that Ray and Emma experience, but Carmen actively tries to overcome these barriers. Carmen says it does not “match up” that she is “supposed to cover X amount of content” with “only X amount of time,” so she fights against the expectation of content coverage. She asks, if students “didn’t get it, then what good is it to go on?” Carmen also knows that teaching is not rewarded in her educational context, but that does not stop her from wanting to improve her teaching. She persists within the system (Figure 5, quote 9), even though she disagrees with the current method for evaluating teaching (Figure 5, quote 10).

    When Carmen’s class periods (n = 25) were observed using COPUS, 16% were didactic, 64% were interactive lecture, and 20% were student-centered (Figure 2).

    Beth the Burdened.

    Beth, who uses interactive lecture, didactic, and student-centered instruction, shows rich knowledge about students, learning, and teaching. She wears herself out as she persistently looks for ways to improve her classroom.

    For Beth’s knowledge of students, like Carmen, she aims to know what her students think and creates opportunities to do so. While Emma does not grade, because it is not professionally incentivized, Beth chooses to grade in order to see where students are struggling and to better understand the nature of students’ ideas (Figure 6, quote 1). Beth also possesses knowledge of students’ negative tendencies. She finds that students come to class unprepared, which leaves Beth frustrated, because the class activities she planned do not work as well (Figure 6, quote 2). More than all other personas, Beth becomes frustrated with this lack of effort on students’ part. Beth also seems jaded, because no matter what she tries, there will be students who resist taking responsibility for their own learning. Despite constantly putting herself out there for students, she does not know how to overcome student disengagement (Figure 6, quote 3).

    FIGURE 6. Beth the Burdened. This figure provides an overview of Beth’s knowledge of students, teaching values, approaches to innovations, and perceived barriers.

    Regarding Beth’s teaching values, she expresses the goal of getting her students engaged with the course. For Beth, engagement is personal. Better engagement will lead to better retention. Better retention helps to assure Beth that she is doing a good job as an instructor, which is something she very much wants to do (Figure 6, quote 4). Beth also values using peer–peer interaction. She will often ask questions during class and give students time to discuss the answers and their reasoning with one another. Beth believes students learn as they discuss and explain (Figure 6, quote 5). Finally, Beth values fostering students’ problem-solving and application skills (Figure 6, quote 6). Unlike Emma, who expects students to problem solve on their own, Beth believes she “can’t just expect [students] to do all of [their learning] on their own.” Instead, like Carmen, she provides students with time to “practice in class.”

    For approaches to innovations, Beth has been implementing a variety of evidence-based practices for years. But she still finds her students make conceptual mistakes, leaving her at a loss for what else to do to address students’ needs (Figure 2, quote). Beth likes to implement formative assessments. She states she already knows a lot about her students’ commonly held misconceptions, so she thinks mostly about the ways formative assessment can help students, not the additional insights she can gain (Figure 6, quote 7). More than other personas, Beth avails herself of teaching innovations reported in education research by looking through the literature or interacting with education researchers. She especially seeks nuanced strategies to improve student learning (Figure 6, quote 8). Finally, Beth, like Carmen, expresses a personal preference to implement interactive teaching practices (Figure 6, quote 9). Unlike Ray, who enjoys interacting with his students to better relate to them, Beth likes the interactivity, because she likes seeing students engage with her curriculum.

    Beth perceives two primary barriers in her teaching. She finds student-centered instruction to be a resource-intensive process and struggles to secure the necessary resources. For example, she views teaching assistants (TAs) as an essential resource for scaling up active-learning pedagogies, but she cannot always find appropriate TA support (Figure 6, quote 10). Beth also struggles to balance content coverage and active learning. She feels students need a “basis of the topic” before engaging in activities about the topic, so she worries she is doing “too much lecturing and sometimes not enough interaction” in class. She is conflicted and has not “quite grasped how to do [both] all the time.”

    When Beth’s classes (n = 15) were observed using COPUS, 33% were didactic, 27% were interactive lecture, and 40% were student-centered (Figure 2).

    Research Question 2. What Are the Similarities and Differences in Desired PD Outcomes among Personas?

    Using interviews, we captured instances of AACR instructors describing what they want out of their AACR group meetings and other PD experiences. We show the distribution of these code occurrences across personas in Table 3.

    TABLE 3. What personas say they want from PD: Code instances that emerged for desired PD outcomesᵃ

    Desired PD outcome                                          Emma   Ray   Carmen   Beth
    Knowledge of student misconceptions                          X      X
    Knowledge of how students respond to an activity                    X               X
    A cross-course effort for tackling misconceptions                   X      X        X
    Ways to improve student motivation and engagement            X      X      X
    Knowledge of the department’s learning goals                 X      X      X        X
    Instructional materials and sharing of teaching knowledge    X      X      X        X
    Assessment ideas and items                                   X      X      X        X

    ᵃAn “X” indicates that at least one individual in a particular persona expressed a desire for this PD outcome.

    Among the personas, only Emma and Ray express a desire to learn more about student misconceptions from PD (Table 3). In contrast, Carmen and Beth, who practice more interactive lecture compared with Emma and Ray (Figure 1), do not mention this PD desire. For Emma, learning about student misconceptions provides her with “more confidence in spending time on something” when she knows “it’s something that many of them struggle with.” Emma also expresses that she would like to learn more about student misconceptions because she feels that her means of assessing student understanding in large classes, multiple-choice questions, “does not do a very good job of identifying misconceptions.”

    The next distinction we found among our personas was the desire to understand how students may respond to activities before implementing those activities in class. Only Ray and Beth mention this PD desire (Table 3). Interestingly, Ray and Beth want to know students’ reactions to activities for different reasons. Ray, who is acutely in tune with students’ perspectives, knows there can be “a lot of pushback from the students” toward in-class activities. It is comforting and “useful” to him to be “aware ahead of time” of “how the students were affected” by an activity beyond “their thought process change.” In contrast, Beth, who puts the burden of student learning on herself, expresses this desired PD outcome because she would like to be completely prepared for “what types of [student] questions I might expect” or what “type of [student] response” she is likely to encounter when running an in-class activity.

    While only Emma and Ray express wanting to learn about student misconceptions from PD, Ray, Carmen, and Beth want to work with their colleagues during PD to collaboratively tackle misconceptions (Table 3). For example, Beth wants to “discuss patterns of how [misconceptions] occur” with her teaching colleagues to better understand their nature and to inform “how you kind of address some of them” and “how these different topics are sometimes explained.” Similarly, Ray, who likes to relate to his students and invest in their futures (subsection Ray the Relater), recognizes that misconceptions “are so hard to change” and feels he cannot “change many” in his semester with students. He would like PD to provide an opportunity to “be very consistent about [misconceptions],” because if his teaching colleague “is aware of it and we’re meeting as a group and she gets these students in the following course next year, then she’ll be addressing it as well.” Only Emma does not mention wanting to collaboratively tackle misconceptions.

    Finally, we found distinctions among our personas in wanting to improve student motivation and engagement. While Emma, Ray, and Carmen seek out ways to motivate and engage their students, Beth does not mention this as an outcome she explicitly desires (Table 3). Emma, who values getting students enthused about course material (subsection Emma the Expert), finds that she cannot always “keep that enthusiasm and effort” and searches for ways to develop this skill. She also recognizes she is better at engaging students at the individual level but does not “know how to do it with these large classes.” Thus, she wants “to work more individually into small groups” and improve on “how to do this in a manageable way with 500 students.” Likewise, Ray wants to find “a better mixture of ways” to engage students’ interests during class and determine “how do you reboot their interest every 10 minutes?” Unlike Emma and Ray, who seek ways of capturing students’ attention, Carmen seeks out ways to “get a broader group of the class involved in the activities and to get them to really engage in the activities.”

    We found that all personas want more direction on the teaching goals of their departments (Table 3). For example, Emma is unclear about her teaching objectives and needs specific targets for improvement:

    It’s not clear what our teaching goals are in terms of when we talk about improving a class. What do we actually mean? What do we want to actually improve? Is it the mean that we care about? Is it the median? Or is it just the top 30%? I don’t know.

    Ray similarly asks,

    What is the goal? What are the learning outcomes we want? Is there a quantity of things that we need [students] to know? Are we preparing them for the MCAT? Are we exciting them about biology? Are we ensuring that they’ve been exposed to the breadth of biology they could get from [our university]?

    Likewise, Carmen states the “more we talk to each other about it, the better consensus we’ll come to about what are the appropriate levels [of depth] that we cover these different topics.” Beth asks what “we want students coming out [of college] to be able to do?” Thus, all personas lack clarity about what to teach and to whom, and they see PD as an opportunity to discuss these concerns, reach a consensus, and build a shared list of teaching targets.

    Further, all personas want new teaching ideas, perspectives, and activities with demonstrated success (Table 3). Different personas reveal different reasons for wanting to tap into other instructors’ teaching materials and knowledge. Carmen and Beth want “good active-learning exercises” that “have addressed [misconceptions] or used examples to try to address these concerns.” In contrast, Ray recognizes some of his teaching colleagues “don’t share my opinions” about teaching. He likes to discuss the diversity of pedagogies during PD, because it forces him to reflect on his own views and “makes me think about, ‘well okay maybe I’m not right.’” Emma recognizes that “we kind of have a lot of the same issues with our students” and sharing teaching experiences at PD “is kind of a way to commiserate but to also learn about new things” that she can apply in her class.

    All personas also want access to assessment ideas and items (Table 3), for similar reasons. They recognize that developing items to effectively assess student learning is challenging, especially in large classes and across a diversity of topics and learning objectives. Carmen exemplifies this sentiment, stating, “Everyone knows what the [student] problems are,” but finding “a way to come up with questions to address those issues is difficult.” Additionally, all personas are dissatisfied with their current methods of assessing student learning. For example, Emma states,

    We need to have another way of assessing how our students are learning if something is not working effectively especially in the large classes, in terms of, yeah, maybe they can answer multiple-choice questions, but I’m not convinced that that’s a good gauge of their learning, and I’d just like to explore different ways of doing that.

    In all, the personas seek outcomes from PD that overlap in some areas and diverge in others.

    DISCUSSION

    The work presented here provides the first application of persona methodology to college instructors engaged in PD for evidence-based teaching. College instructors face a mountainous challenge in learning to use progressive pedagogies, focus on core concepts over content coverage, teach scientific practices, and create inclusive classrooms. We hold the perspective that college instructors need PD to meet these expectations. According to the TCSR model, effective PD should consider instructor thinking, teaching context, and teaching practice (Gess-Newsome et al., 2003). Interactions among these factors lead to variability in the readiness of faculty to implement evidence-based practices (Ferrare and Hora, 2014; Hora, 2014). Our personas illustrate this fact. Pairs of personas (Emma/Ray and Carmen/Beth) share similar teaching practices. Yet these pairs diverge in knowledge, values, approaches to innovation, and sensitivities to institutional barriers, creating diverse instructor types. Therefore, we suggest PD should be learner centered, wherein the faculty participants are the learners. We describe here ways to use personas in learner-centered PD and build upon our data to suggest ways that learner-centered PD could assist Emma, Ray, Carmen, and Beth. We also describe lines of research that would improve the feasibility of learner-centered PD. We end the paper by considering the limitations of this study and making concluding remarks.

    Using Personas in Learner-Centered PD

    Learners are different, and any one-size-fits-all approach is likely to have significant limitations. We would expect the personas produced in our study to approach PD in different ways, so we see personas as a tool to be used by PD facilitators for reflection. Reflection can sensitize facilitators to the variety of instructors they may be working with and help them prepare to respond. We do not envision personas as a diagnostic tool, for example, to identify the Rays in your PD setting. Instead, we see personas as a way to bring awareness and attention to the diversity that exists in PD settings and to help facilitators side with their own instructors and recognize what they bring to the table. For example, reading about Beth may cue facilitators to traits among their PD participants that are reminiscent of Beth and prompt them to identify additional instructor traits they can work with in their setting.

    Another important factor to remember is that personas are fictional characters and should be treated as such. The personas are based on data, manifested in a humanistic form for ease of conveying information. As such, personas should be interpreted similarly to tables, graphs, and other data representations found in the literature. For instance, a facilitator would not identify an instructor in their PD setting as “column 6” on Table X of a study, but might use “column 6” as a point of departure for reflection or discussion. Likewise, our personas could be brought into PD settings as points of departure, so long as facilitators remember that personas are fictional characters meant only to convey empirical data in a humanistic way.

    Learner-Centered PD for Emma, Ray, Carmen, and Beth

    In the following sections, we demonstrate some approaches PD facilitators can take when working with instructors like Emma, Ray, Carmen, or Beth. While other personas certainly exist, the personas we report provide a starting point for facilitators to appeal to their participants in unique ways and steer them to needed resources.

    Emma.

    Emma might transition to greater use of evidence-based teaching if PD facilitators build upon her interests and help her discover the rationale for evidence-based teaching. For example, Emma likes learning objectives, because they help her organize the course, but she may not integrate learning objectives into her classroom practice. A PD facilitator could develop Emma’s thinking by helping her understand the impact of sharing learning objectives with students. Doing so increases students’ learning (Armbruster et al., 2009), motivation and engagement (Armbruster et al., 2009; Winkelmes et al., 2016; Reynolds and Kearns, 2017), metacognition (Levine et al., 2008), and self-regulated use of appropriate study strategies (Simon and Taylor, 2009). Similarly, a PD facilitator could leverage Emma’s value for exciting students through relating content to real life. This connection could be achieved by helping her learn to use evidence-based pedagogies like case-based learning (Borrego et al., 2013) or authentic data-interpretation tasks (Hoskins et al., 2007; Round and Campbell, 2013; Zagallo et al., 2016) that provide real-life problem-solving scenarios. Finally, Emma perceives that evidence-based teaching provides too much handholding. A PD facilitator could empathize with Emma that, yes, science is hard but that an opportunity gap exists within the student population, particularly for women, students of color, and students from lower socioeconomic status backgrounds (Eddy et al., 2014; Ballen et al., 2017), and that evidence-based teaching can help close the opportunity gap (Freeman et al., 2011; Eddy and Hogan, 2014; Ballen et al., 2017). If Emma has an interest in equity, these facts might help her overcome her focus on sorting and her concern about the “top” students.

    Ray.

    Ray’s teaching practice is similar to Emma’s, but he places greater value on connecting with students and gauging their feelings in real time during class. A PD facilitator can tap into Ray’s intrinsic interest in relating to students and teach him concrete ways to cognitively engage students (Chi and Wylie, 2014). Similarly, Ray values forming personal connections with students, so he may benefit from evidence-based strategies that work to create a community-type learning environment (Freeman et al., 2007; Tanner, 2013; Dewsbury and Brame, 2019). Importantly, Ray’s greatest barrier appears to be how to scale up personal connections with students in large classes. Thus, a PD facilitator could emphasize evidence-based techniques that allow Ray to personally connect with students in large classes, such as using index cards to learn students’ names and call on students by name (Tanner, 2011). In PD, Ray may be swayed by stories of instructors who have made successful transitions to greater interaction without losing the adoration of their students. Finally, Ray may need evidence-based teaching materials, practice, and mentoring with feedback to see that he can be the instructor he wants to be while also giving students time to reflect, write, and problem solve in class.

    Carmen.

    PD facilitators can help Carmen by connecting her to advanced PD opportunities. Carmen will thrive when given the chance to work with colleagues to build new lessons that engage students in scientific practice (e.g., Elliott et al., 2016; Pelletreau et al., 2018). Additionally, PD facilitators who lead change efforts at the department level, such as the departmental action teams presented by Corbo and colleagues (Corbo et al., 2016), should consider tapping Carmen as a leader. Her persistence with evidence-based teaching and willingness to tackle barriers could be critical in work that addresses the systemic structures for recognizing, rewarding, and incentivizing teaching (Reinholz and Apkarian, 2018).

    Beth.

    Finally, PD facilitators could help Beth by providing a curriculum about student interest and motivation (Lovelace and Brickman, 2013; Seidel et al., 2015; Jordt et al., 2017). Interestingly, Beth is the only persona who does not express student motivation and engagement as a desired PD outcome (Table 3), even though she reports a lot of frustration with students’ lack of engagement. There may be many explanations for this discrepancy. For instance, perhaps Beth does not self-identify that she needs to learn about student motivation and engagement. Beth also puts such a heavy burden of learning on herself that she may have inadvertently taken a lot of the “fun” out of motivating and engaging students. Beth would benefit from learning that not all instructors feel as burdened to motivate students as she does (Tanner, 2011). She might also learn that her problems with motivating students may have to do with how she approaches motivation. Beth has tremendous leadership potential, yet she may be unlikely to lead because of burnout. PD facilitators could help Beth gain recognition and passion by going to bat for her at the departmental or institutional level. They also could advocate alongside her for TA support and other types of resources that she needs to keep fighting the good fight.

    The starting recommendations here illustrate that different approaches can be taken to help meet each instructor in his or her respective change process. We found that Emma, Ray, Carmen, and Beth themselves report wanting different PD outcomes as well. For instance, Carmen and Beth do not want to learn more about common student misconceptions, perhaps because they already know them and/or implement teaching practices that reveal them. In contrast, Emma and Ray do express wanting to learn more about common student misconceptions, demonstrating a gap in their knowledge of students that PD could help fill. How can differences in knowledge about students be handled in PD if all personas are in attendance? Perhaps facilitators could design tasks that make Carmen’s and Beth’s knowledge in that area explicit in the PD space for Emma and Ray to learn from. For example, a facilitator could intentionally pair Carmen/Beth with Emma/Ray on a task that describes student work. Another difference we found in personas’ desires for PD is whether they expressed a desire to know how students respond to student-centered activities. Only Ray and Beth mention this. Perhaps this is because Ray likes to connect to students’ perspectives and Beth takes it on herself to always be fully prepared. In PD, a facilitator could design jigsaw or group tasks that pair Ray and Beth on a task related to that topic, and concurrently pair Emma and Carmen on a task related instead to developing learning goals and objectives, because those two share an interest in that PD outcome.

    Finally, our personas discuss barriers to greater or lesser degrees and feel more or less stymied by them. Yet the barriers expressed by all personas, such as concerns about content coverage or large class sizes with insufficient teaching resources, indicate problems in departmental, institutional, and disciplinary cultures (Kezar, 2014; Corbo et al., 2016; Reinholz and Apkarian, 2018). These barriers cannot be addressed by individual instructors, nor by individual PD facilitators. Indeed, research suggests that the lackluster impact of educational reform efforts over the past 20 or so years can be attributed to attending primarily to individuals (Henderson et al., 2011). PD focused on the instructor level certainly can help instructors build knowledge and gain skill as instructors work collaboratively on teaching and learning projects and give one another feedback (Henderson et al., 2011). However, barriers will persist, and the impact of reform efforts will continue to be limited if departments, institutions, and the discipline do not also address the systemic issues that block reform, such as incentive structures that reward research but not teaching. We encourage PD facilitators to leverage their social capital to contribute to change efforts at higher levels of organization (e.g., departments). Good models for this type of PD exist (Corbo et al., 2016; Reinholz et al., 2017). We also encourage departmental, institutional, and disciplinary leaders, who may rarely work at the instructor level, to rise to the challenge of leading institutional change and to partner with and support those working at the individual level. Again, good models exist (e.g., Association of American Universities, 2017).

    Research Aimed at Learner-Centered PD

    While Emma, Ray, Carmen, and Beth provide a starting point for learner-centered PD, research is still needed to pinpoint the diversity in thinking and context that impact teaching practice among college science instructors (Gess-Newsome et al., 2003; Ferrare and Hora, 2014; Hora, 2014).

    First, learner-centered PD will accelerate with further research characterizing instructor thinking. K–12 science education research offers a rich body of literature on instructor thinking and points to its critical influence on student learning (reviewed in van Driel et al., 2014; Wilson, 2013). Yet research in higher education consists of only a few key studies (e.g., Wagner et al., 2007; Speer and Wagner, 2009; Johnson and Larsen, 2012; Ferrare and Hora, 2014; Hora, 2014; Hill, 2016; Auerbach and Andrews, 2018). Auerbach and colleagues (2018) showed that, among active-learning instructors, experts distinguished themselves by their tendency to notice teaching practices that align with research on how people learn (e.g., focusing on student thinking). Looking at our data, our two active-learning personas (Carmen/Beth) think about the classroom in different ways. Carmen constantly thinks of how she can coach her students, while Beth stews over the never-ending difficulty of helping students learn. An open question is whether these distinctions in thinking lead to variation in the impact of interactive teaching on student learning. Ferrare and Hora (2014) conducted a case analysis to show that two instructors who rely primarily on didactic pedagogies talk about student learning in markedly different ways. Likewise, Emma and Ray both rely primarily on didactic approaches. Yet Emma thinks mostly about her own expertise and the need for students to take responsibility for their own learning, while Ray thinks mostly about winning the devotion of his students and making them feel comfortable in the learning environment. We need research to determine how these different dispositions impact the sustained use of evidence-based teaching. Learner-centered teaching in the science classroom has been enhanced by our ever-expanding knowledge of students’ topic-specific difficulties. Likewise, learner-centered PD will be enhanced with growth in our understanding of instructors’ naïve and scientific ideas about motivation, cognition, assessment, and inclusivity, to name a few.

    Second, research is needed about context, particularly the social dynamics that take place in PD. If we aim to bring the successes of learner-centered classrooms into PD settings, we need to understand what elements can and cannot transfer from one context to the other. One immediate difference is that there is a clear expert–novice dynamic in classrooms, whereas in PD there is not. All instructors are considered autonomous experts in some form, including experts in teaching for those with extensive classroom experience. PD facilitators bring expertise as well, but the participating instructors may or may not value their expertise. Therefore, in a college science PD context, there are often competing experts in the room. This dynamic will influence the PD environment and how a facilitator needs to approach facilitation. For example, one open question is how power dynamics influence PD facilitator interactions. Are there differences in facilitation approaches across a postdoctoral education researcher, a biology instructor, and a PD professional from the CTL? Further, PD facilitators, who are often colleagues of the PD attendees, must maintain long-term relationships and collegiality beyond the PD setting. One fruitful next step may be to turn to fields like social psychology to better understand the impact of these social dynamics in PD settings. For instance, biology education researchers could collaborate with sociologists and look particularly into the major frameworks that characterize interpersonal relations in group settings. These include social exchange theory, which explores how individuals calculate a cost–benefit analysis in social interactions (Emerson, 1976); expectation states theory, which explores how individuals construct expectations of themselves and others through social cues such as dominant behavior or knowledge of social status (Berger et al., 1972); and the effects of social influence on one’s internal beliefs and outward behavior (Kelman, 1958). Social interactions that emerge from differences in power dynamics and in instructor needs are likely to be complex, yet characterizing them may be a fruitful next step toward defining effective learner-centered PD.

    Third, we need more research to pinpoint the diverse instructor types who participate in PD. Our study captures some of that diversity with a sample of instructors that includes tenure-track and teaching-intensive faculty who have been teaching from 6 to 31 years. All of the instructors in our sample signed up voluntarily to participate in the AACR PD program. They persisted in PD for years, motivated by their interest in connecting with colleagues to share experiences in teaching (McCourt et al., 2017). Clearly, the insights we have uncovered pertain to many PD settings. However, our sample includes only research-intensive universities and does not include any new instructors or instructors from a diverse range of institution types. What is the blend of thinking, context, and practice among instructors who teach at other types of institutions or new instructors? Persona methodology can be a key approach for tackling these questions, and answering these questions is an important step toward understanding the learners who come to PD.

    Limitations

    As described earlier, the instructors in this study comprise only one sample from the college biology instructor population within the United States. Our four personas, therefore, reflect the characteristics and context found within this sample. There are likely many other instructor personas in the U.S. population of college biology instructors that have not been captured here. Further, we constructed our personas from instructors at various stages of development as teachers. That is to say, an instructor who fit into “Ray” during our study may develop to fit into a different persona later. In marketing and branding fields, personas are frequently updated (Pruitt and Adlin, 2006). Personas, including the ones we present, are static representations that may need to be updated over time.

    CONCLUSION

    Personas are a powerful tool for transforming large data sets about people into compact, digestible representations that can be easily understood and used for action (Pruitt and Adlin, 2006). Further, personas preserve the human side of the data. They facilitate an evidence-based approach to viewing instructors as individuals who are consolidating their educational views with innovative practices. We hope this approach encourages empathy in change agents and provides a way to understand their “user base,” so they can better meet instructors where they are in the process of becoming evidence-based teachers.

    ACKNOWLEDGMENTS

    This material is based on work supported by the National Science Foundation (NSF) under grants DUE 1347733 and 1322962. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the NSF. We thank the participants who took part in this study. We also thank the Biology Education Research Group (BERG) at the University of Georgia, who improved the quality of this work with critical feedback. We also thank the AACR PD team for their helpful feedback.

    REFERENCES

  • Aldenderfer, M., & Blashfield, R. (1984). Cluster analysis. Newbury Park, CA: Sage Publications.
  • Allen, G. K., Wedman, J. F., & Folk, L. C. (2001). Looking beyond the valley: A five-year case study of course innovation. Innovative Higher Education, 26(2), 103–119.
  • American Association for the Advancement of Science. (2011). Vision and change in undergraduate biology education: A call to action. Retrieved June 11, 2019, from www.visionandchange.org/
  • Andrews, T. C., & Lemons, P. P. (2015). It’s personal: Biology instructors prioritize personal evidence over empirical evidence in teaching decisions. CBE—Life Sciences Education, 14(1), ar7.
  • Andrews, T. M., Leonard, M. J., Colgrove, C. A., & Kalinowski, S. T. (2011). Active learning not associated with student learning in a random sample of college biology courses. CBE—Life Sciences Education, 10(4), 394–405. doi: 10.1187/cbe.11-07-0061
  • Arabie, P., Hubert, L. J., & De Soete, G. (1996). Clustering and classification. Singapore: World Scientific.
  • Armbruster, P., Patel, M., Johnson, E., & Weiss, M. (2009). Active learning and student-centered pedagogy improve student attitudes and performance in introductory biology. CBE—Life Sciences Education, 8(3), 203–213.
  • Association of American Universities. (2017). Progress toward achieving systemic change: A five-year status report on the AAU Undergraduate STEM Education Initiative. Washington, DC.
  • Auerbach, A. J., & Andrews, T. C. (2018). Pedagogical knowledge for active-learning instruction in large undergraduate biology courses: A large-scale qualitative investigation of instructor thinking. International Journal of STEM Education, 5(1), 19.
  • Auerbach, A. J., Higgins, M., Brickman, P., & Andrews, T. C. (2018). Teacher knowledge for active-learning instruction: Expert–novice comparison reveals differences. CBE—Life Sciences Education, 17(1), ar12. doi: 10.1187/cbe.17-07-0149
  • Avgerinou, M. D., & Andersson, C. (2007). E-moderating personas. Quarterly Review of Distance Education, 8(4), 353–364.
  • Baker, L. A., Chakraverty, D., Columbus, L., Feig, A. L., Jenks, W. S., Pilarz, M., … & Wesemann, J. L. (2014). Cottrell Scholars Collaborative New Faculty Workshop: Professional development for new chemistry faculty and initial assessment of its efficacy. Journal of Chemical Education, 91(11), 1874–1881. doi: 10.1021/ed500547n
  • Ballen, C. J., Wieman, C., Salehi, S., Searle, J. B., & Zamudio, K. R. (2017). Enhancing diversity in undergraduate science: Self-efficacy drives performance gains with active learning. CBE—Life Sciences Education, 16(4), ar56.
  • Beach, A. L., & Cox, M. D. (2009). The impact of faculty learning communities on teaching and learning. Learning Communities Journal, 1(1), 7–27.
  • Berger, J., Cohen, B. P., & Zelditch, M., Jr. (1972). Status characteristics and social interaction. American Sociological Review, 37(3), 241–255.
  • Borrego, M., Cutler, S., Prince, M., Henderson, C., & Froyd, J. E. (2013). Fidelity of implementation of research-based instructional strategies (RBIS) in engineering science courses. Journal of Engineering Education, 102(3), 394–425.
  • Bouwma-Gearhart, J., Sitomer, A., Fisher, K. Q., Smith, C., & Koretsky, M. (2016). Studying organizational change: Rigorous attention to complex systems via a multi-theoretical research model. Paper presented at: American Society for Engineering Education Annual Conference (New Orleans, LA).
  • Brownell, S. E., Freeman, S., Wenderoth, M. P., & Crowe, A. J. (2014). BioCore Guide: A tool for interpreting the core concepts of Vision and Change for biology majors. CBE—Life Sciences Education, 13(2), 200–211.
  • Charmaz, K. (2006). Constructing grounded theory: A practical guide through qualitative analysis. London: Sage.
  • Chi, M. T., & Wylie, R. (2014). The ICAP framework: Linking cognitive engagement to active learning outcomes. Educational Psychologist, 49(4), 219–243.
  • Cooper, A. (1999). The inmates are running the asylum. Indianapolis, IN: Macmillan.
  • Corbo, J. C., Reinholz, D. L., Dancy, M. H., Deetz, S., & Finkelstein, N. (2016). Framework for transforming departmental culture to support educational innovation. Physical Review Physics Education Research, 12(1), 010113.
  • Cox, M. D. (2001). Faculty learning communities: Change agents for transforming institutions into learning organizations. To Improve the Academy, 19(1), 69–93.
  • Cox, M. D. (2004). Introduction to faculty learning communities. New Directions for Teaching and Learning, 2004(97), 5–23.
  • DataCamp Inc. (Producer). (n.d.). Clustering. In Introduction to machine learning (Chapter 5). Retrieved April 2019, from www.datacamp.com/courses/introduction-to-machine-learning-with-r#!
  • Denning, S. (2002). How storytelling ignites action in knowledge-era organisations. RSA Journal, 149(5501), 32–34.
  • Dewsbury, B., & Brame, C. J. (2019). Inclusive teaching. CBE—Life Sciences Education, 18(2), fe2.
  • Ebert-May, D., Derting, T. L., Hodder, J., Momsen, J. L., Long, T. M., & Jardeleza, S. E. (2011). What we say is not what we do: Effective evaluation of faculty professional development programs. BioScience, 61(7), 550–558.
  • Eddy, S. L., Brownell, S. E., & Wenderoth, M. P. (2014). Gender gaps in achievement and participation in multiple introductory biology classrooms. CBE—Life Sciences Education, 13(3), 478–492.
  • Eddy, S. L., & Hogan, K. A. (2014). Getting under the hood: How and for whom does increasing course structure work? CBE—Life Sciences Education, 13(3), 453–468.
  • Elliott, E. R., Reason, R. D., Coffman, C. R., Gangloff, E. J., Raker, J. R., Powell-Coffman, J. A., & Ogilvie, C. A. (2016). Improved student learning through a faculty learning community: How faculty collaboration transformed a large-enrollment course from lecture to student centered. CBE—Life Sciences Education, 15(2), ar22. doi: 10.1187/cbe.14-07-0112
  • Emerson, R. M. (1976). Social exchange theory. Annual Review of Sociology, 2(1), 335–362.
  • Estrada, M., Burnett, M., Campbell, A. G., Campbell, P. B., Denetclaw, W. F., Gutierrez, C. G., … & Zavala, M. (2016). Improving underrepresented minority student persistence in STEM. CBE—Life Sciences Education, 15(3), es5. doi: 10.1187/cbe.16-01-0038
  • Felsenstein, J. (1985). Confidence limits on phylogenies: An approach using the bootstrap. Evolution, 39(4), 783–791.
  • Ferrare, J. J., & Hora, M. T. (2014). Cultural models of teaching and learning in math and science: Exploring the intersections of culture, cognition, and pedagogical situations. Journal of Higher Education, 85(6), 792–825.
  • Finelli, C. J., Daly, S. R., & Richardson, K. M. (2014). Bridging the research-to-practice gap: Designing an institutional change plan using local evidence. Journal of Engineering Education, 103(2), 331–361.
  • Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., & Wenderoth, M. P. (2014). Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences USA, 111(23), 8410–8415. doi: 10.1073/pnas.1319030111
  • Freeman, S., Haak, D., & Wenderoth, M. P. (2011). Increased course structure improves performance in introductory biology. CBE—Life Sciences Education, 10(2), 175–186.
  • Freeman, T. M., Anderman, L. H., & Jensen, J. M. (2007). Sense of belonging in college freshmen at the classroom and campus levels. Journal of Experimental Education, 75(3), 203–220.
  • Gess-Newsome, J., Southerland, S. A., Johnston, A., & Woodbury, S. (2003). Educational reform, personal practical theories, and dissatisfaction: The anatomy of change in college science teaching. American Educational Research Journal, 40(3), 731–767. doi: 10.3102/00028312040003731
  • Grudin, J., & Pruitt, J. (2002). Personas, participatory design and product development: An infrastructure for engagement. In Meeting proceedings of the Participatory Design Conference (PDC), held June 23–25, 2002, Malmö, Sweden.
  • Guy, B. R. (2017). Movers, shakers, & everyone in between: Faculty personas surrounding active learning in the undergraduate STEM classroom. ie: Inquiry in Education, 9(2), 6.
  • Ha, M., Nehm, R. H., Urban-Lurain, M., & Merrill, J. E. (2011). Applying computerized-scoring models of written biological explanations across courses and colleges: Prospects and limitations. CBE—Life Sciences Education, 10(4), 379–393.
  • Haak, D. C., HilleRisLambers, J., Pitre, E., & Freeman, S. (2011). Increased structure and active learning reduce the achievement gap in introductory biology. Science, 332(6034), 1213–1216. doi: 10.1126/science.1204820
  • Haudek, K. C., Kaplan, J. J., Knight, J., Long, T., Merrill, J., Munn, A., … & Urban-Lurain, M. (2011). Harnessing technology to improve formative assessment of student conceptions in STEM: Forging a national network. CBE—Life Sciences Education, 10(2), 149–155.
  • Haudek, K. C., Prevost, L. B., Moscarella, R. A., Merrill, J., & Urban-Lurain, M. (2012). What are they thinking? Automated analysis of student writing about acid–base chemistry in introductory biology. CBE—Life Sciences Education, 11(3), 283–293.
  • Henderson, C., Beach, A. L., & Finkelstein, N. (2011). Facilitating change in undergraduate STEM instructional practices: An analytic review of the literature. Journal of Research in Science Teaching, 48(8), 952–984.
  • Henderson, C., Dancy, M., & Niewiadomska-Bugaj, M. (2012). Use of research-based instructional strategies in introductory physics: Where do faculty leave the innovation-decision process? Physical Review Special Topics—Physics Education Research, 8(2), 020104.
  • Henderson, C., & Dancy, M. H. (2007). Barriers to the use of research-based instructional strategies: The influence of both individual and situational characteristics. Physical Review Special Topics—Physics Education Research, 3(2). doi: 10.1103/PhysRevSTPER.3.020102
  • Henderson, C., & Dancy, M. H. (2011). Increasing the impact and diffusion of STEM education innovations. In Center for the Advancement of Scholarship on Engineering Education Forum, New Orleans, LA. Retrieved June 11, 2018, from www.nae.edu/File.aspx
  • Hill, K. M. (2016). A social constructivist perspective of teacher knowledge: The PCK of biology faculty at large research institutions. In Transforming institutions: Undergraduate STEM education for the 21st century (pp. 353–369). West Lafayette, IN: Purdue University Press.
  • Hora, M. T. (2014). Exploring faculty beliefs about student learning and their role in instructional decision-making. Review of Higher Education, 38(1), 37–70. doi: 10.1353/rhe.2014.0047
  • Hora, M. T., & Ferrare, J. J. (2014). Remeasuring postsecondary teaching: How singular categories of instruction obscure the multiple dimensions of classroom practice. Journal of College Science Teaching, 43, 36–41.
  • Hora, M. T., Oleson, A., & Ferrare, J. J. (2013). Teaching Dimensions Observation Protocol (TDOP) user’s manual. Madison: Center for Education Research, University of Wisconsin–Madison. Retrieved August 30, 2019, from http://tdop.wceruw.org/Document/TDOP-Users-Guide.pdf
  • Hornstein, H. A. (2017). Student evaluations of teaching are an inadequate assessment tool for evaluating faculty performance. Cogent Education, 4(1), 1304016.
  • Hoskins, S. G., Stevens, L. M., & Nehm, R. H. (2007). Selective use of the primary literature transforms the classroom into a virtual laboratory. Genetics, 176(3), 1381–1389.
  • Johnson, E. M., & Larsen, S. P. (2012). Teacher listening: The role of knowledge of content and students. Journal of Mathematical Behavior, 31(1), 117–129.
  • Jordt, H., Eddy, S. L., Brazil, R., Lau, I., Mann, C., Brownell, S. E., … & Freeman, S. (2017). Values affirmation intervention reduces achievement gap between underrepresented minority and white students in introductory biology classes. CBE—Life Sciences Education, 16(3), ar41.
  • Kapur, M. (2016). Examining productive failure, productive success, unproductive failure, and unproductive success in learning. Educational Psychologist, 51(2), 289–299. doi: 10.1080/00461520.2016.1155457
  • Kelly, P. (2006). What is teacher learning? A socio-cultural perspective. Oxford Review of Education, 32, 505–519.
  • Kelman, H. C. (1958). Compliance, identification, and internalization: Three processes of attitude change. Journal of Conflict Resolution, 2(1), 51–60.
  • Kezar, A. (2014). How colleges change: Understanding, leading, and enacting change. New York: Routledge.
  • Lave, J., & Wenger, E. (1991). Situated learning. Cambridge, UK: Cambridge University Press. doi: 10.1017/CBO9780511815355
  • Levine, L. E., Fallahi, C. R., Nicoll-Senft, J. M., Tessier, J. T., Watson, C. L., & Wood, R. M. (2008). Creating significant learning experiences across disciplines. College Teaching, 56(4), 247–254.
  • Lovelace, M., & Brickman, P. (2013). Best practices for measuring students’ attitudes toward learning science. CBE—Life Sciences Education, 12(4), 606–617.
  • Lund, T. J., & Stains, M. (2015). The importance of context: An exploration of factors influencing the adoption of student-centered teaching among chemistry, biology, and physics faculty. International Journal of STEM Education, 2(1). doi: 10.1186/s40594-015-0026-8
  • Madsen, A., McKagan, S. B., Sayre, E. C., Martinuk, M., & Bell, A. (2014). Personas as a powerful methodology to design targeted professional development resources. arXiv:1408.1125v2 [physics.ed-ph].
  • Manduca, C. A., Iverson, E. R., Luxenberg, M., Macdonald, R. H., McConnell, D. A., Mogk, D. W., & Tewksbury, B. J. (2017). Improving undergraduate STEM education: The efficacy of discipline-based professional development. Science Advances, 3, e1600193.
  • McCourt, J. S., Andrews, T. C., Knight, J. K., Merrill, J. E., Nehm, R. H., Pelletreau, K. N., … & Lemons, P. P. (2017). What motivates biology instructors to engage and persist in teaching professional development? CBE—Life Sciences Education, 16(3), ar54. doi: 10.1187/cbe.16-08-0241
  • Moharreri, K., Ha, M., & Nehm, R. H. (2014). EvoGrader: An online formative assessment tool for automatically evaluating written evolutionary explanations. Evolution: Education and Outreach, 7(1), 15.
  • National Academies of Sciences, Engineering, and Medicine. (2018). How people learn II: Learners, contexts, and cultures. Washington, DC: National Academies Press.
  • National Research Council. (2000). How people learn: Brain, mind, experience, and school. Washington, DC: National Academies Press.
  • Novick, L. R., & Catley, K. M. (2007). Understanding phylogenies in biology: The influence of a Gestalt perceptual principle. Journal of Experimental Psychology: Applied, 13(4), 197.
  • Park, S., Jang, J.-Y., Chen, Y.-C., & Jung, J. (2011). Is pedagogical content knowledge (PCK) necessary for reformed science teaching? Evidence from an empirical study. Research in Science Education, 41(2), 245–260.
  • Pelletreau, K. N., Knight, J. K., Lemons, P. P., McCourt, J. S., Merrill, J. E., Nehm, R. H., … & Smith, M. K. (2018). A faculty professional development model that improves student learning, encourages active-learning instructional practices, and works for faculty at multiple institutions. CBE—Life Sciences Education, 17(2), es5.
  • Pfund, C., Miller, S., Brenner, K., Bruns, P., Chang, A., Ebert-May, D., … & Khan, I. M. (2009). Summer Institute to improve university science teaching. Science, 324(5926), 470–471.
  • Poh, M.-Z., Swenson, N. C., & Picard, R. W. (2010). A wearable sensor for unobtrusive, long-term assessment of electrodermal activity. IEEE Transactions on Biomedical Engineering, 57(5), 1243–1252.
  • President’s Council of Advisors on Science and Technology. (2012). Engage to excel: Producing one million additional college graduates with degrees in science, technology, engineering, and mathematics. Washington, DC: U.S. Government Office of Science and Technology.
  • Prevost, L. B., Smith, M. K., & Knight, J. K. (2016). Using student writing and lexical analysis to reveal student thinking about the role of stop codons in the central dogma. CBE—Life Sciences Education, 15(4), ar65.
  • Prosser, M., Trigwell, K., & Taylor, P. (1994). A phenomenographic study of academics’ conceptions of science learning and teaching. Learning and Instruction, 4(3), 217–231.
  • Pruitt, J., & Adlin, T. (2006). The persona lifecycle: Keeping people in mind throughout product design (interactive technologies). San Francisco, CA: Elsevier.
  • Reinholz, D. L., & Apkarian, N. (2018). Four frames for systemic change in STEM departments. International Journal of STEM Education, 5(3).
  • Reinholz, D. L., Corbo, J. C., Dancy, M., & Finkelstein, N. (2017). Departmental action teams: Supporting faculty learning through departmental change. Learning Communities Journal, 9, 5–32.
  • Reynolds, H. L., & Kearns, K. D. (2017). A planning tool for incorporating backward design, active learning, and authentic assessment in the college classroom. College Teaching, 65(1), 17–27.
  • Round, J. E., & Campbell, A. M. (2013). Figure facts: Encouraging undergraduates to take a data-centered approach to reading primary literature. CBE—Life Sciences Education, 12(1), 39–46.
  • Sadler, P. M., Sonnert, G., Coyle, H. P., Cook-Smith, N., & Miller, J. L. (2013). The influence of teachers’ knowledge on student learning in middle school physical science classrooms. American Educational Research Journal, 50(5), 1020–1049.
  • Salminen, J., Jung, S. G., An, J., Kwak, H., Nielsen, L., & Jansen, B. J. (2019). Confusion and information triggered by photos in persona profiles. International Journal of Human-Computer Studies, 129, 1–14.
  • Seidel, S. B., Reggi, A. L., Schinske, J. N., Burrus, L. W., & Tanner, K. D. (2015). Beyond the biology: A systematic investigation of noncontent instructor talk in an introductory biology course. CBE—Life Sciences Education, 14(4), ar43.
  • Seidel, S. B., & Tanner, K. D. (2013). “What if students revolt?”—Considering student resistance: Origins, options, and opportunities for investigation. CBE—Life Sciences Education, 12(4), 586–595.
  • Simon, B., & Taylor, J. (2009). What is the value of course-specific learning goals? Journal of College Science Teaching, 39(2), 52–57.
  • Smith, M. K., Jones, F. H., Gilbert, S. L., & Wieman, C. E. (2013). The Classroom Observation Protocol for Undergraduate STEM (COPUS): A new instrument to characterize university STEM classroom practices. CBE—Life Sciences Education, 12(4), 618–627.
  • Smith, M. K., Vinson, E. L., Smith, J. A., Lewin, J. D., & Stetzer, M. R. (2014). A campus-wide study of STEM courses: New perspectives on teaching practices and perceptions. CBE—Life Sciences Education, 13(4), 624–635.
  • Speer, N. M., & Wagner, J. F. (2009). Knowledge needed by a teacher to provide analytic scaffolding during undergraduate mathematics classroom discussions. Journal for Research in Mathematics Education, 40(5), 530–562.
  • Stains, M., Harshman, J., Barker, M. K., Chasteen, S. V., Cole, R., DeChenne-Peters, S. E., … & Laski, F. A. (2018). Anatomy of STEM teaching in North American universities. Science, 359(6383), 1468–1470.
  • Stains, M., Pilarz, M., & Chakraverty, D. (2015). Short- and long-term impacts of the Cottrell Scholars Collaborative New Faculty Workshop. Journal of Chemical Education, 92(9), 1466–1476.
  • Stains, M., & Vickrey, T. (2017). Fidelity of implementation: An overlooked yet critical construct to establish effectiveness of evidence-based instructional practices. CBE—Life Sciences Education, 16(1), rm1.
  • Stevens, L. M., & Hoskins, S. G. (2014). The CREATE strategy for intensive analysis of primary literature can be used effectively by newly trained faculty to produce multiple gains in diverse students. CBE—Life Sciences Education, 13(2), 224–242.
  • Strauss, A., & Corbin, J. (1998). Basics of qualitative research techniques. Thousand Oaks, CA: Sage.
  • Tanner, K. D. (2011). Moving theory into practice: A reflection on teaching a large, introductory biology course for majors. CBE—Life Sciences Education, 10(2), 113–122.
  • Tanner, K. D. (2013). Structure matters: Twenty-one teaching strategies to promote student engagement and cultivate classroom equity. CBE—Life Sciences Education, 12(3), 322–331.
  • Urban-Lurain, M., Prevost, L., Haudek, K. C., Henry, E. N., Berry, M., & Merrill, J. E. (2013). Using computerized lexical analysis of student writing to support just-in-time teaching in large enrollment STEM courses. Paper presented at: 2013 IEEE Frontiers in Education Conference (FIE), Oklahoma City, OK.
  • van Driel, J. H., Berry, A., & Meirink, J. (2014). Research on science teacher knowledge. In Handbook of research on science education (Vol. 2, pp. 862–884). New York: Routledge.
  • Wagner, J. F., Speer, N. M., & Rossa, B. (2007). Beyond mathematical content knowledge: A mathematician’s knowledge needed for teaching an inquiry-oriented differential equations course. Journal of Mathematical Behavior, 26(3), 247–266.
  • Weston, M., Haudek, K. C., Prevost, L., Urban-Lurain, M., & Merrill, J. (2015). Examining the impact of question surface features on students’ answers to constructed-response questions on photosynthesis. CBE—Life Sciences Education, 14(2), ar19.
  • Wilks, D. S. (2011). Statistical methods in the atmospheric sciences (Vol. 100). Academic Press.
  • Wilson, S. M. (2013). Professional development for science teachers. Science, 340(6130), 310–313.
  • Winkelmes, M.-A., Bernacki, M., Butler, J., Zochowski, M., Golanics, J., & Weavil, K. H. (2016). A teaching intervention that increases underserved college students’ success. Peer Review, 18(1/2), 31–36.
  • Zagallo, P., Meddleton, S., & Bolger, M. S. (2016). Teaching real data interpretation with models (TRIM): Analysis of student dialogue in a large-enrollment cell and developmental biology course. CBE—Life Sciences Education, 15(2), ar17.