
Effective Practices in Undergraduate STEM Education Part 1: Examining the Evidence

    Published Online: https://doi.org/10.1187/cbe.09-06-0038

    INTRODUCTION

    Since the publication of reports in the late 1990s by the National Science Foundation (NSF; 1996), the National Research Council (NRC; 1996, 1999), and the Boyer Commission on Educating Undergraduates in the Research University (1998) on the importance of improving undergraduate education in science, technology, engineering, and mathematics (STEM), at least 13 other federal civilian departments and agencies have spent billions of dollars on more than 200 programs to realize this goal. Most of that spending has come from the NSF and the National Institutes of Health (Government Accounting Office, 2005). Many private foundations also have invested hundreds of millions of dollars in efforts to improve undergraduate STEM education. For example, since 1988 the Howard Hughes Medical Institute has awarded more than $1.5 billion in grants to improve science education at the precollege and college levels.1

    As a result of this financial support and commitment from the public and private sectors, numerous and varied promising practices for teaching, learning, assessment, and institutional organization of undergraduate STEM education have been researched and implemented in recent years. These promising practices range from improvements in teaching in individual classrooms to changes in entire departments.2 They include the increased prominence of campus and national centers for teaching excellence, professional development for faculty members (e.g., the National Academies Summer Institute on Undergraduate Education in Biology,3 On the Cutting Edge: Professional Development for Geoscience Faculty,4 FIRST II5), and large outreach and dissemination efforts (e.g., Project Kaleidoscope,6 SENCER7). Virtually all of these promising practices have focused on student-centered, inquiry-based approaches to teaching (summarized in Handelsman et al., 2007) or alternative assessments of student learning (e.g., see references in Deeds and Callen, 2006), in contrast to more traditional approaches that emphasize lecturing and multiple-choice or short-answer examinations. Some of these new approaches, such as Peer-Led Team Learning8 and Just-in-Time Teaching,9 have gained national recognition and prominence.

    Over the past decade, new practices have been implemented at vastly different grain sizes. Some have been targeted at specific classrooms, whereas others have focused on restructuring entire curricula. Still others have emphasized the role of assessment and evaluation of learning in improving teaching effectiveness (e.g., NRC, 2003a,b). Moreover, virtually all of these practices were developed independently of one another and have emphasized somewhat different goals. In addition, communication across the STEM disciplines and within their subdisciplines is often lacking.

    Thus, despite many years of effort and significant financial expenditure, surprisingly little is known about the collective impact of these approaches on the academic success of individuals and of different populations of students. For example, do students who experience these new approaches to learning become sufficiently interested in these subject areas to take additional STEM courses, compared with students from more traditional courses? Do these students succeed in higher-level STEM courses? Do they retain more information over longer periods and understand concepts more deeply? Are they better able to apply what they have learned in one context to others?

    At the institutional and professional levels, are faculty willing to change their teaching when presented with evidence that certain approaches to teaching are more effective than others? Data from valid and reliable assessment instruments, such as concept inventories (Hestenes et al., 1992; Mazur, 1997; Hake, 1998; Krause et al., 2004;10 Garvin-Doxas et al., 2007; Garvin-Doxas and Klymkowsky, 2008; Klymkowsky and Garvin-Doxas, 2008; Smith et al., 2008; also see http://gci.lite.msu.edu), often show that students do not understand concepts deeply; when faculty are presented with such data, are they actively reassessing their own approaches to undergraduate teaching?

    At the national level, how effective are these promising practices in changing the institutional culture of higher education toward acceptance and adoption of new approaches to undergraduate teaching, student learning, assessment of learning, and the balance of professional responsibilities of STEM faculty and within STEM departments? Given significant institutional differences in approaches and intended audiences, is enough evidence emerging to indicate that certain approaches to undergraduate teaching and learning “transcend” these differences? Can these approaches be adopted to engage the broad spectrum of undergraduate student audiences in the kinds of learning that will be required to address the large, complex problems that must be addressed in the twenty-first century?

    A NOVEL COLLABORATIVE PROJECT TO RE-ENVISION UNDERGRADUATE STEM EDUCATION BASED BOTH ON EVIDENCE AND SOCIETAL NEEDS

    With a collaborative grant from the NSF, the NRC's Board on Science Education11 and the Wisconsin Center for Education Research (WCER) at the University of Wisconsin12 have developed complementary projects to:

    1. Organize two workshops to elucidate the current state of knowledge about selected “promising practices” in undergraduate STEM education and to suggest areas where additional research or a major synthesis of existing research is needed (NRC).

    2. Use this knowledge and evidence to develop a process that attempts to engage large segments of the undergraduate STEM education community with these practices (WCER).

    The WCER project was developed in response to two concerns. The first is that innovation in STEM has “stalled,” in the sense that the evidence gathered to date about effective or promising practices has not systematically transcended individual courses or programs. The second, argued by the leaders of the WCER initiative, is that more attention must be devoted to helping students understand and value the STEM disciplines not only in their own right but also for their essential roles in addressing the urgent scientific, social, and economic challenges facing the planet and its inhabitants.13

    The NRC held two workshops in June and October 2008 to examine Evidence on Selected Promising Practices in Undergraduate Science, Technology, Engineering, and Mathematics (STEM) Education.14 WCER began to use this information in its Mobilizing STEM Education for a Sustainable Future initiative15 at the first of two meetings of “critical advisors” in January 2009. A second meeting of these advisors was held in June 2009 (when this article was submitted). Because the WCER initiative is still under development, the remainder of this article focuses on the NRC's contributions to the initiative. The next “From the National Academies” article will highlight the WCER project and its future plans.

    SUMMARY OF THE NRC WORKSHOPS

    The goal of the two NRC workshops was to examine the evidence of impact for a selected number of promising practices. Each workshop helped to elucidate the state of knowledge about the selected practices. Through 22 commissioned articles prepared before the two workshops by many of the leading experts on undergraduate STEM education, and through presentations and discussions (see footnote 14 for links to all presentations), workshop participants explored areas where additional research or a major synthesis of existing research is still needed. Together, these workshops provided opportunities to examine questions about the quality of the evidence available on the impact of these practices and to explore future directions and core questions facing undergraduate STEM education: What do we know, how do we know it, and how will that information serve undergraduate STEM education in the future?

    Workshop I, held on June 30, 2008, focused on several large issues:

    • Linking evidence and learning goals

    • The state of evidence in discipline-based education research

    • A survey of “promising practices” in undergraduate STEM education

    Workshop II, convened on October 13 and 14, 2008, examined several specific promising practices and also considered questions about wider implementation of these practices, through commissioned papers about the following topics:

    • Problem- and case-based teaching and learning

    • The use of assessment to guide teaching and learning: an examination of concept inventories in the science disciplines and engineering, and of student misconceptions

    • Structuring of the learning environment

    • Re-envisioning and redesigning large courses in undergraduate STEM education

    • The efficacy of research experiences for undergraduates

    • Professional development for undergraduate STEM faculty

    • Systemic change in undergraduate STEM education including dissemination of promising practices

    FUTURE DIRECTIONS

    The workshops produced a series of very valuable articles, available on the Internet, for STEM faculty who are interested in rethinking their approaches to teaching and learning, senior administrators who are seeking ways to expand these approaches within and across departments, and researchers who are looking for reviews of the existing literature on a variety of topics related to undergraduate STEM education. The remainder of this article highlights papers prepared by Susan Singer (2008), Chair of the Workshop Planning Committee, which summarized the discussions and conclusions from the June workshop, and by James Fairweather (2008), who was asked to review and synthesize all of the additional articles submitted for the October workshop.

    Questions about what constitutes effective evidence for the efficacy of specific programs and approaches to improving undergraduate STEM education dominated the first workshop. As Singer (2008) points out, “The legitimacy of a given form of evidence depends on the context of the question being asked. Evidence of student learning might be used to inform one's teaching, to generate a knowledge base, or convince colleagues to adopt new teaching practices. Evidence that is useful in working with a group of students may not be of sufficient rigor to contribute to a broader knowledge base.” (p. 1).

    Workshop presenters and participants agreed that STEM courses have many different learning goals and that different kinds of evidence are needed to demonstrate their efficacy in achieving these different goals. Goals include the following:

    • Mastering a few major principles/concepts well and in-depth (as distinct from procedural knowledge)

    • Long-term retention of what is learned

    • Building a mental framework that serves as a foundation for future learning

    • Developing visualization competence including the ability to critique, interpret, construct, and connect with physical systems

    • Developing the analytical skills and critical judgment needed to use scientific information to make informed decisions

    • Understanding the nature of science

    • Finding satisfaction in engaging with real-world issues that require knowledge of science (modified from Singer, 2008, pp. 2–3)

    Given the breadth of learning goals, it is not surprising that solid evidence about which pedagogical methods are most effective is lacking for some of the promising practices. There is less evidence for discriminating among approaches that promote long-term retention of information and conceptual understanding than for evaluating techniques that encourage student interest or increase overall learning of course content. As discussed during the NRC workshop by Narum (2008), the evidence of success in scaling such practices from individual instructors and courses to academic departments and institutions is mixed.

    Singer also pointed out that when STEM faculty attempt to apply research methods to examine their teaching, they often modify approaches from their own scientific research rather than applying methodologies from the social sciences. Singer contends that “[b]oth the scale and extent of research collaboration on undergraduate STEM learning needs to expand if a coherent body of evidence is to be established.” (p. 1).

    It was also clear in both workshops that although faculty rightfully demand evidence that certain approaches are more effective than others to increase student learning, producing such evidence often is not sufficient to persuade some faculty to actually rethink their uses of more traditional approaches (Henderson and Dancy, 2007). An important exception seems to be in physics, where the development and application of the Force Concept Inventory (Hestenes et al., 1992; Mazur, 1997; Hake, 1998) has led a growing number of faculty members to adopt new teaching approaches designed to address students' alternative conceptions (Mestre, 2008). Whether concept inventories in other disciplines will be equally successful in persuading faculty to re-examine their assumptions about student learning is currently unknown.
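
    For context on how such evidence is typically quantified, Hake (1998) compared courses by their average normalized gain on the FCI, that is, the fraction of the possible pretest-to-posttest improvement that a class actually achieves. Expressed in terms of class-average percentage scores (a minimal restatement of Hake's measure), it is

    \langle g \rangle = \frac{\%\langle \mathrm{post} \rangle - \%\langle \mathrm{pre} \rangle}{100 - \%\langle \mathrm{pre} \rangle}.

    In that survey, courses using interactive-engagement methods achieved roughly twice the average normalized gain of courses taught primarily by traditional lecture.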

    Although many of the workshop presenters provided detailed information about the availability and strength of evidence for various approaches to teaching and learning, in his summary of those articles, Fairweather (2008) offered some additional and provocative ideas. First, he argues that there is now enough evidence to demonstrate increased student learning through inquiry-based approaches and that the research community should turn to questions for which the evidence base is not as strong. Continuing to show the positive effects of inquiry-based approaches in still more classrooms, he contends, contributes little to the knowledge base on effective practices in STEM education.

    Second, Fairweather suggests that far more progress would be made in improving student learning and interest in STEM subjects if more faculty who rely on less effective pedagogies could be convinced to restructure their practices, even slightly, by replacing some of those approaches with inquiry-based techniques. Committing resources toward this goal ultimately would be more effective than offering ongoing professional development opportunities to those faculty who already have made this commitment.

    Reform efforts historically have been based largely on the assumption that evidence of effective or exemplary projects will result in widespread change. Fairweather reiterated that evidence for effective practices is necessary but not sufficient to convince individual faculty and other academic decision makers to change practices and policies as indicated by that evidence. He argued that institutional reward systems that value research over teaching discourage faculty from responding to evidence of the effectiveness of promising practices. Thus, other approaches to addressing the challenge of going to scale are needed. Accordingly, the WCER project is attempting to examine what might accelerate larger-scale change to improve student learning.

    Third, Fairweather's summary emphasizes that STEM education is neither monolithic nor homogeneous. One of the essential unanswered questions about effective STEM practice is what approaches to teaching, learning, and assessment transcend the disciplines (and are thus appropriate for use in almost any setting) and what approaches are more discipline-specific. Also unclear is which practices that seem to work well within a discipline can be used in multidisciplinary or interdisciplinary approaches to teaching and learning.

    Work to prepare a more detailed summary of the workshops as an NRC report will begin soon. The breadth of topics that were explored in these workshops and the richness of the articles that contributed to those events can serve as a source of useful information for any faculty member, administrator, education researcher, or policy maker who wishes to explore what is currently known about effective practices in STEM education. These workshops are also providing an important basis for the WCER initiative.

    ACKNOWLEDGMENTS

    All of the papers, PowerPoint presentations, and agendas for these two workshops can be accessed through links at www7.nationalacademies.org/bose/promising%20practices_homepage.html. The National Research Council workshops on Selected Promising Practices in Undergraduate Science, Technology, Engineering, and Mathematics (STEM) Education were supported by a grant from the NSF (DUE-0745112). The WCER project, Mobilizing STEM Education for a Sustainable Future, was supported by a grant from the NSF (DUE-0744106).

    FOOTNOTES

    1For additional information, see www.hhmi.org/about/sci_ed/index.html.

    2For additional information, see http://nsf.gov/div/index.jsp?div=DUE and the U.S. Department of Education's Fund for the Improvement of Postsecondary Education/Comprehensive Program, www.ed.gov/programs/fipsecomp/index.html.

    3For additional information, see www.academiessummerinstitute.org. Also see Pfund et al. (2009).

    4For additional information, see http://serc.carleton.edu/NAGTWorkshops/index.html.

    5Faculty Institutes for Reforming Science Teaching Project. For additional information, see http://darkwing.uoregon.edu/~first/goalsof.htm.

    6For additional information, see http://pkal.org.

    7Science Education for New Civic Engagements and Responsibilities. For additional information, see http://sencer.net.

    8For additional information, see www.pltl.org.

    9For additional information, see http://jittdl.physics.iupui.edu/jitt.

    10The Journal of Chemical Education maintains the Chemical Concepts Inventory online at http://jce.divched.org/JCEDLib/QBank/collection/CQandChP/CQs/ConceptsInventory/CCIIntro.html.

    11For additional information, see www7.nationalacademies.org/bose.

    12For additional information, see www.wcer.wisc.edu.

    13Excerpted and modified from the vision statement for the WCER project as of June 6, 2009.

    14Links to the workshop agendas, members of the planning committee, and all of the commissioned papers that were prepared for the workshop are available at www7.nationalacademies.org/bose/Promising%20Practices_Homepage.html. Links to PowerPoint files that were presented during the workshops are available within each workshop agenda.

    15For additional information, see http://mobilizingstem.wceruw.org.

    REFERENCES

  • Boyer Commission on Educating Undergraduates in the Research University (1998). Reinventing Undergraduate Education: A Blueprint for America's Research Universities. Menlo Park, CA: Carnegie Foundation for the Advancement of Teaching. http://naples.cc.sunysb.edu/Pres/boyer.nsf/ (accessed 27 May 2009).
  • Deeds D., Callen B. (2006). Proceedings of a National STEM Assessment Conference, jointly sponsored by Drury University and the National Science Foundation. Springfield, MO: Drury University.
  • Fairweather J. (2008). Linking Evidence and Promising Practices in Science, Technology, Engineering, and Mathematics (STEM) Education: A Status Report. Commissioned paper presented at the NRC workshop on Evidence on Selected Promising Practices in Undergraduate Science, Technology, Engineering, and Mathematics (STEM) Education, Washington, DC. http://nrc51/xpedio/groups/dbasse/documents/webpage/046555~1.pdf (accessed 6 June 2009).
  • Garvin-Doxas K., Klymkowsky M., Elrod S. (2007). Building, using, and maximizing the impact of concept inventories in the biological sciences: report on a National Science Foundation-sponsored conference on the construction of concept inventories in the biological sciences. CBE Life Sci. Educ. 6, 277-282. http://www.lifescied.org/cgi/content/full/6/4/277 (accessed 31 May 2009).
  • Garvin-Doxas K., Klymkowsky M. W. (2008). Understanding randomness and its impact on student learning: lessons from the biology concept inventory (BCI). CBE Life Sci. Educ. 7, 227-233. http://www.lifescied.org/cgi/content/full/7/2/227 (accessed 31 May 2009).
  • Government Accounting Office (2005). Higher Education: Federal Science, Technology, Engineering, and Mathematics Programs and Related Trends (GAO-06-114). Washington, DC: Government Printing Office. www.gao.gov/new.items/d06114.pdf (accessed 27 May 2009).
  • Hake R. R. (1998). Interactive-engagement versus traditional methods: a six-thousand-student survey of mechanics test data for introductory physics courses. Am. J. Phys. 66, 64-74.
  • Handelsman J., Miller S., Pfund C. (2007). Scientific Teaching. New York: W.H. Freeman.
  • Henderson C., Dancy M. H. (2007). Barriers to the use of research-based instructional strategies: the influence of both individual and situational characteristics. Phys. Rev. Spec. Top. Phys. Educ. Res. 3, 020102. http://prst-per.aps.org/pdf/PRSTPER/v3/i2/e020102 (accessed 8 June 2009).
  • Hestenes D., Wells M., Swackhamer G. (1992). Force concept inventory. Phys. Teach. 30, 141-166.
  • Klymkowsky M. W., Garvin-Doxas K. (2008). Recognizing student misconceptions through Ed's tools and the biology concept inventory. PLoS Biol. 6, doi:10.1371/journal.pbio.0060003. www.plosbiology.org/article/info:doi/10.1371/journal.pbio.0060003 (accessed 31 May 2009).
  • Krause S., Birk J., Bauer R., Jenkins B., Pavelich M. J. (2004). Development, Testing, and Application of a Chemistry Concept Inventory. Presentation at the 34th ASEE/IEEE Frontiers in Education Conference. http://fie-conference.org/fie2004/papers/1213.pdf (accessed 1 June 2009).
  • Mazur E. (1997). Peer Instruction: A User's Manual. Upper Saddle River, NJ: Prentice Hall Series in Educational Innovation.
  • Mestre J. (2008). Learning Goals in Undergraduate STEM Education and Evidence for Achieving Them. Commissioned paper presented at the NRC workshop on Evidence on Selected Promising Practices in Undergraduate Science, Technology, Engineering, and Mathematics (STEM) Education, Washington, DC. www7.nationalacademies.org/bose/PP_Mestre_STEM%20White%20Paper.pdf (accessed 8 June 2009).
  • Narum J. (2008). Promising Practices in Undergraduate STEM Education. Commissioned paper presented at the NRC workshop on Evidence on Selected Promising Practices in Undergraduate Science, Technology, Engineering, and Mathematics (STEM) Education, Washington, DC. www7.nationalacademies.org/bose/PP_Narum_WhitePaper.html (accessed 7 June 2009).
  • National Research Council (NRC) (1996). From Analysis to Action: Undergraduate Education in Science, Mathematics, Engineering and Technology. Washington, DC: National Academies Press. http://books.nap.edu/catalog.php?record_id=9128 (accessed 31 May 2009).
  • NRC (1999). Transforming Undergraduate Education in Science, Mathematics, Engineering, and Technology. Washington, DC: National Academies Press. http://nap.edu/catalog.php?record_id=6453 (accessed 31 May 2009).
  • NRC (2003a). Evaluating and Improving Undergraduate Teaching in Science, Technology, Engineering, and Mathematics. Washington, DC: National Academies Press. www.nap.edu/catalog.php?record_id=10024 (accessed 1 June 2009).
  • NRC (2003b). Improving Undergraduate Instruction in Science, Technology, Engineering, and Mathematics: Report of a Workshop. Washington, DC: National Academies Press. www.nap.edu/catalog.php?record_id=10711 (accessed 1 June 2009).
  • National Science Foundation (1996). Shaping the Future: New Expectations for Undergraduate Education in Science, Mathematics, Engineering, and Technology (NSF 96-139). Arlington, VA. www.nsf.gov/pubs/1998/nsf98128/contents.pdf (accessed 27 May 2009).
  • Pfund C., et al. (2009). Summer institute to improve biology education at research universities. Science 324, 470-471. www.sciencemag.org/cgi/content/summary/324/5926/470 (accessed 31 May 2009).
  • Singer S. (2008). Linking Evidence and Learning Goals. Commissioned paper presented at the NRC workshop on Evidence on Selected Promising Practices in Undergraduate Science, Technology, Engineering, and Mathematics (STEM) Education, Washington, DC. http://nrc51/xpedio/groups/dbasse/documents/webpage/046681~1.pdf (accessed 6 June 2009).
  • Smith M. K., Wood W. B., Knight J. K. (2008). The genetics concept assessment: a new concept inventory for gauging student understanding of genetics. CBE Life Sci. Educ. 7, 422-430. www.lifescied.org/cgi/content/full/7/4/422 (accessed 31 May 2009).