
Published Online: https://doi.org/10.1187/cbe.15-11-0229

Earlier this year, I was enjoying dinner with an influential colleague at an education conference. As would be expected given the venue, this colleague was interested in good teaching. I assumed she would be familiar with the abundance of research on the effectiveness of active learning (e.g., Hake, 1998; Prince, 2004; Ruiz-Primo et al., 2011; Freeman et al., 2014). After listening to one of our tablemates describe an active-learning strategy he had used, my companion turned to me and asked, “But how do we know that it works?” Her question caught me by surprise. Hadn't this question been answered time and time again?

This moment of disconnect between research on effective instruction and the many instructors and other decision makers who are not yet familiar with this body of knowledge got me thinking about the power of the CBE—Life Sciences Education (LSE) community. Since the inception of the journal, LSE editors, authors, and readers have been called on to serve as translators between what is known about teaching and learning and how teaching is practiced. LSE has put this translational role into practice in a number of ways, such as the Approaches to Biology Teaching and Learning and Current Insights features and the Research Methods essays.

These, along with many other articles and essays published in LSE, have described or cited the guiding principles for teaching that promotes active learning: engaging students, aiming for an outcome or objective, providing structure and opportunities for practice, giving feedback, encouraging interaction and reflection, expecting higher-level thinking, informing instructional decisions with evidence of student learning and development, and incorporating well-motivated and well-timed explanations from reading or mini-lectures (Bransford et al., 1999; Singer et al., 2012; Dolan and Collins, 2015; Kober, 2015). We know this type of teaching works when deployed well (Freeman et al., 2014), and that it works especially well for students who have been traditionally underserved (e.g., Eddy and Hogan, 2014).

Imagine for a moment that teaching using active learning is a construction project, and the goal is to construct student learning. The construction tools (e.g., screwdriver, hammer) are the instructional materials (e.g., assignments, clicker questions, exams), and how the tools are used is the instructional strategy. At this point in understanding teaching and learning, we know how to design the screwdriver and the hammer. We also know how a screwdriver and a hammer should be used, and that some aspects of construction will require a screwdriver, while others will require a hammer.

A person who is new to construction may not know that the hammer, rather than the handle of the screwdriver, is the better tool for driving in a nail. He or she may not know that a particular screw requires a Phillips-head instead of a flathead screwdriver. This does not mean we need to redemonstrate that a screwdriver or hammer works. Rather, we need to figure out ways to help everyone involved in construction learn how useful the tools are, how to select the right tool for the job, how to use the tools, and what latitude there is for using a range of tools.

This is the direction in which we need to head with the study of biology education. We need to know what is happening during active learning that makes it work—at the levels of the student, instructor, discipline, department, and institution. We need to understand what working means, for whom, and in what contexts (Tanner, 2011). This will require a different kind of research—what some are calling the next generation of biology education research (BER), or BER 2.0.

Excitingly, the LSE community is already making progress in this direction. Several recent articles in the journal have aimed at demonstrating what makes “flipped instruction” work (Gross et al., 2015; Jensen et al., 2015) and what “working” means (van Vliet et al., 2015). To continue to make progress in this direction, we need to look to other fields for theory and methods, including cognitive science, psychology, sociology, and anthropology, while keeping in mind our important role in translating the work in these fields, so it is comprehensible to a much broader audience. We need to think creatively about how to bring life sciences research methods—such as those used to study physiological systems, to model ecological processes across scales, and to analyze metabolic networks—to bear on the study of teaching and learning. We need to examine research from such diverse environments as K–12 education and corporate settings and to envision how it might help us understand biology education at other levels and in other settings.

We have embraced concept inventories to measure student learning, which has been an important driver of deeper consideration of how we assess our students’ learning. Now we need to explore other ways of thinking about student cognition (Pellegrino et al., 2001), such as threshold concepts (Meyer, 2008; Meyer and Land, 2006; Loertscher et al., 2014); learning progressions (Alonzo and Gotwals, 2012); and schema, phenomenological primitives, and cognitive construals (diSessa, 1988, 1993; Hammer, 1996; Coley and Tanner, 2015). We need to examine other ways biology learners develop, for example, in their identities as scientists; their sense of belonging to science; or their abilities to reflect, self-regulate, and embrace a growth mind-set (Duckworth and Yeager, 2015). We need to balance our need to use common instruments to compare results across studies with our need to develop new and better ways to measure important outcomes that will help us improve the experience of learning biology (e.g., Pellegrino et al., 2001; Yeager et al., 2013).

We need to study instructional change beyond single classrooms or institutions. For example, how do faculty develop knowledge and skills important for teaching research courses (Auchincloss et al., 2014), supporting all students in learning, or guiding students in learning particular domains of life science? What lessons learned from professional development in other disciplines and K–12 settings apply to understanding experiences of biology faculty? In what ways do our institutions differ in their teaching climates, cultures, operations, and incentive systems, and how do these differences support or constrain faculty members in improving their teaching? Again, we can inform our research in these areas by exploring other fields, such as industrial and organizational psychology, improvement science, and health systems research (Campbell et al., 2000; World Health Organization, 2015).

BER 2.0—moving beyond answering the question of whether it works—will be best positioned to thrive if we continue to embrace our role in translating what is known about teaching and learning so that it can both inform our work and serve a broader audience of biology educators. This has been a priority and a defining feature of LSE since its inception and will be the focus of a new phase of development of the journal in 2016. Stay tuned!

REFERENCES

  • Alonzo AC, Gotwals AW (2012). Learning Progressions in Science: Current Challenges and Future Directions, Rotterdam, Netherlands: Sense Publishers.
  • Auchincloss LC, Laursen SL, Branchaw JL, Eagan K, Graham M, Hanauer DI, Lawrie G, McLinn CM, Pelaez N, Rowland S, et al. (2014). Assessment of course-based undergraduate research experiences: a meeting report. CBE Life Sci Educ 13, 29-40.
  • Bransford JD, Brown AL, Cocking RR (1999). How People Learn: Brain, Mind, Experience, and School, Washington, DC: National Academies Press.
  • Campbell M, Fitzpatrick R, Haines A, Kinmonth AL, Sandercock P, Spiegelhalter D, Tyrer P (2000). Framework for design and evaluation of complex interventions to improve health. BMJ 321, 694-696.
  • Coley JD, Tanner K (2015). Relations between intuitive biological thinking and biological misconceptions in biology majors and nonmajors. CBE Life Sci Educ 14, ar8.
  • diSessa A (1988). Knowledge in pieces In: Constructivism in the Computer Age, ed. G Forman and P Pufall, Hillsdale, NJ: Erlbaum, 49-70.
  • diSessa AA (1993). Toward an epistemology of physics. Cogn Instr 10, 105-225.
  • Dolan EL, Collins JP (2015). We must teach more effectively: here are four ways to get started. Mol Biol Cell 26, 2151-2155.
  • Duckworth AL, Yeager DS (2015). Measurement matters: assessing personal qualities other than cognitive ability for educational purposes. Educ Researcher 44, 237-251.
  • Eddy SL, Hogan KA (2014). Getting under the hood: how and for whom does increasing course structure work? CBE Life Sci Educ 13, 453-468.
  • Freeman S, Eddy SL, McDonough M, Smith MK, Okoroafor N, Jordt H, Wenderoth MP (2014). Active learning increases student performance in science, engineering, and mathematics. Proc Natl Acad Sci USA 111, 8410-8415.
  • Gross D, Pietri ES, Anderson G, Moyano-Camihort K, Graham MJ (2015). Increased preclass preparation underlies student outcome improvement in the flipped classroom. CBE Life Sci Educ 14, ar36.
  • Hake RR (1998). Interactive-engagement versus traditional methods: a six-thousand-student survey of mechanics test data for introductory physics courses. Am J Phys 66, 64-74.
  • Hammer D (1996). Misconceptions or p-prims: how may alternative perspectives of cognitive structure influence instructional perceptions and intentions? J Learn Sci 5, 97-127.
  • Jensen JL, Kummer TA, Godoy PDdM (2015). Improvements from a flipped classroom may simply be the fruits of active learning. CBE Life Sci Educ 14, ar5.
  • Kober N (2015). Reaching Students: What Research Says about Effective Instruction in Undergraduate Science and Engineering, Washington, DC: National Academies Press.
  • Loertscher J, Green D, Lewis JE, Lin S, Minderhout V (2014). Identification of threshold concepts for biochemistry. CBE Life Sci Educ 13, 516-528.
  • Meyer J, Land R (2006). Overcoming Barriers to Student Understanding: Threshold Concepts and Troublesome Knowledge, New York: Routledge.
  • Meyer JHF (2008). Threshold Concepts within the Disciplines, Rotterdam, Netherlands: Sense Publishers.
  • Pellegrino JW, Chudowsky N, Glaser R (2001). Knowing What Students Know: The Science and Design of Educational Assessment, Washington, DC: National Academies Press.
  • Prince M (2004). Does active learning work? A review of the research. J Eng Educ 93, 223-231.
  • Ruiz-Primo MA, Briggs D, Iverson H, Talbot R, Shepard LA (2011). Impact of undergraduate science course innovations on learning. Science 331, 1269-1270.
  • Singer SR, Nielsen NR, Schweingruber HA (2012). Discipline-Based Education Research: Understanding and Improving Learning in Undergraduate Science and Engineering, Washington, DC: National Academies Press.
  • Tanner KD (2011). Reconsidering “what works.” CBE Life Sci Educ 10, 329-333.
  • van Vliet EA, Winnips JC, Brouwer N (2015). Flipped-class pedagogy enhances student metacognition and collaborative-learning strategies in higher education but effect does not persist. CBE Life Sci Educ 14, ar26.
  • World Health Organization (2015). What Is Health Policy and Systems Research (HPSR)? www.who.int/alliance-hpsr/about/hpsr/en.
  • Yeager D, Bryk A, Muhich J, Hausman H, Morales L (2013). Practical Measurement, Palo Alto, CA: Carnegie Foundation for the Advancement of Teaching.