
Grappling with the Literature of Education Research and Practice

    Published Online: https://doi.org/10.1187/cbe.07-09-0087

    Abstract

    The absence of a central database and use of specialized language hinder nonexperts in becoming familiar with the science teaching and learning literature and using it to inform their work. The challenge of locating articles related to a specific question or problem, coupled with the difficulty of comprehending findings based on a variety of different perspectives and practices, can be prohibitively difficult. As I have transitioned from bench to classroom-based research, I have become familiar with how to locate, decipher, and evaluate the education research literature. In this essay, I point out analogies to the literature of science research and practice, and I reference some of the literature that I have found useful in becoming an education researcher. I also introduce a new regular feature, “Current Insights: Recent Research in Science Teaching and Learning,” which is designed to point CBE—Life Sciences Education (CBE-LSE) readers to current articles of interest in life sciences education, as well as more general and noteworthy publications in education research.

    INTRODUCTION

    Ideally, the breadth of education research informs teaching and learning research in the life sciences. However, the body of theoretical and practical research in education is sprawling, with >20,000 articles published each year in >1100 journals (Mosteller et al., 2004). The absence of any unified, systematic mechanism for cataloging or accessing this information makes it nearly impossible for scientists to keep abreast of the literature on science teaching and learning, much less use it to inform their own work. When the challenge of locating articles of interest is coupled with the difficulty of comprehending the findings of an entirely different discipline, with epistemologies, cultures, and practices distinct from those of the science community (Feuer et al., 2002; Shavelson and Towne, 2002), many scientists throw up their hands in frustration. As I have transitioned from bench to classroom-based research, I have had both the necessity and the luxury, and sometimes the aggravation, of developing a working knowledge of theoretical and practical research in education.

    As I have shared references with my scientist colleagues, I have witnessed several beneficial outcomes. Learning about others' teaching and outreach efforts from practitioner journals has helped us develop a more comprehensive idea of the needs, interests, and priorities of our colleagues in education and avoid “reinventing the wheel” in education programming (Dolan et al., 2004). Reviewing the research literature has honed our thinking about how to document the effects of educational interventions on teaching and learning. I have expanded my vocabulary such that I can have more informed discussions with the evaluators of our precollege outreach and partnership work. From a broader perspective, more departments of science are hiring faculty with education expertise (Bush et al., 2006). Scientists familiar with this scholarship may be better prepared to make informed decisions about the promotion and tenure of their education colleagues, and they may also learn that they themselves benefit from participating in pedagogical endeavors (Bower, 1996; Schultz, 1996; Tanner, 2000; McKeown, 2003; Spillane, 2004; Busch and Tanner, 2006).

    My intention is not to encourage scientists to become educational researchers, but rather to better position them to benefit as teachers from the education literature. Even education researchers themselves have noted that much of the literature is written for academics in the discipline, rather than a broader audience of researchers, practitioners, and policymakers (Davis, 2007). Assumptions are often delineated using foreign concepts such as “theoretical framework.” Methodology is a combination of research method and epistemologies of learning. Protocols are described using unfamiliar terminology such as “differentiated instruction.” The data may take unfamiliar forms, such as quotes from focus groups, transcripts of interviews, or videotapes of classrooms. As I have learned to locate, decipher, and evaluate the literature, with significant guidance from colleagues and mentors, I have used analogies to science research and practice to clarify my thinking. Although these analogies have limitations, I have found them to be useful steppingstones in better understanding this body of knowledge, and I share several of them here. Thus, the intent of this essay, and one purpose of the new CBE-LSE feature Current Insights: Recent Research in Science Teaching and Learning, is to serve as a bridge for individuals with scientific expertise to enter the land of education scholarship, and to provide tools that may be useful on the journey.

    TOOLS FOR ACCESSING THE LITERATURE

    Several online databases provide access to education research citations. For example, the Education Resources Information Center (ERIC; http://www.eric.ed.gov) is an Internet-based digital library of education research and information that provides bibliographic records of journal articles and other education-related materials with sponsorship from the U.S. Department of Education. Print versions of ERIC information are published monthly in two formats: Resources in Education and Current Index to Journals in Education. Two other databases, Educational Research Abstracts Online (published by Routledge of Taylor & Francis Group; http://www.informaworld.com/smpp/title~content=t713417651) and Education Research Complete (published by EBSCO, Ipswich, MA; http://www.epnet.com/thisTopic.php?marketID=4&topicID=639) provide abstracts for thousands of journals, books, and monographs, as well as full text for many journals and education-related conference papers. Finally, Google Scholar (http://scholar.google.com) enables searches of scholarly literature, including peer-reviewed papers, theses, books, preprints, abstracts, and technical reports. Google Scholar uses web-crawling software to index scholarly articles publicly available on the web. The company has standing agreements with academic publishers, professional societies, preprint repositories, and universities, which have helped maximize the "findability" of relevant education scholarship.

    Although these and other indexing and abstract services provide points of access, the information provided may not be sufficiently detailed regarding a study's purpose, setting, participants, research design, or other aspects that would help a nonexpert reader evaluate its relevance to his or her interests. In addition, researchers, policymakers, and even parents are increasingly demanding a rapid way to access concise information about educational outcomes to use “scientifically based research” as the grounds for “evidence-based practice” (Feuer et al., 2002; Shavelson and Towne, 2002; Slavin, 2002; St. Pierre, 2006). These demands have spurred a grassroots effort within the education research community to make the research process, including assumptions, qualifiers, and limitations, more transparent by accompanying manuscripts with a “structured abstract.”

    First proposed by Mosteller et al. (2004), the structured abstract is designed to make a study's salient features clearer and more accessible so that practitioners and decision makers can more easily locate studies of interest and assess their implications for teaching practice. Kelly and Yin (2007) propose that structured abstracts be used to “make the argumentative structure of education research articles more apparent and open to scrutiny.” Advocates contend that authors should make explicit the nature of their evidence and claims (e.g., descriptive, correlative, causal), circumstances that may affect the strength of their claims (e.g., study setting, size, context), and other qualifiers that might influence the applicability of their claims to teaching practice. ERIC now requests that authors submit a structured abstract with their contributed materials. As structured abstracts become more commonplace, they will likely result in greater accessibility for researchers and decision makers outside the education community.

    Scientists at academic institutions that lack education departments face the additional challenge that their libraries generally do not maintain subscriptions to education journals. Thus, open-access publications, which are freely available for reading and reproduction, have special appeal. Several such journals, including CBE-LSE, were initiated in electronic form to support open-access scholarship (Table 1). Other publishers give authors the freedom to choose whether their publications will be freely available, for example, Springer's Open Choice (http://www.springer.com/dal/home/open+choice) and iOpenAccess from Taylor & Francis (http://www.tandf.co.uk/journals/iopenaccess.asp). Unfortunately, not all of their journals participate in these services, and authors may have to pay substantial fees for providing open access to their work. The service Open J-Gate (a contribution of Informatics [India]; http://openj-gate.org) was launched in 2006 to serve as a portal to the open-access literature, annotating all indexed articles with respect to their “peer-review” status.

    Table 1. Journals of education research and practice

    Journal | Publisher | Mission | Typical content | Open access | Website

    Life science education

        Advances in Physiology Education, 2001–Present | American Physiological Society | Learning of physiology, physiology education as a scholarly endeavor | Practice, applied, essays | Yes | http://advan.physiology.org/
        American Biology Teacher, 1938–Present | National Association of Biology Teachers | How-to suggestions regarding biology teaching and learning, including current advances in life science and their social and ethical implications | Practice, applied, essays | Some | http://www.nabt.org/sites/S1/index.php?p=2/
        Biochemistry and Molecular Biology Education, 2006–Present | International Union of Biochemistry and Molecular Biology; Wiley InterScience | Teacher preparation and student learning in biochemistry, molecular biology, and related sciences | Practice, applied, theoretical | No | http://www3.interscience.wiley.com/cgi-bin/jhome/112782101/
        Bioscene: Journal of College Biology Teaching, 1975–Present | Association of College & University Biology Educators | Teaching biology at the college level | Practice, applied, essays | Yes | http://acube.org/publications.html
        BioScience, 1964–Present | American Institute of Biological Sciences | Current research in biology and essays on education, public policy, history, and the nature of biological sciences | Practice, applied, essays | Some | http://www.aibs.org/bioscience/
        CBE—Life Sciences Education, 2002–Present | American Society for Cell Biology | Life science education and teaching and learning in related disciplines from kindergarten through graduate school | Applied, theoretical, essays | Yes | http://www.lifescied.org/
        Journal of Biological Education, 1966–Present | Institute of Biology (British) | Applied life science education research related to curriculum and policy | Practice, applied | Some | http://www.iob.org/general.asp?section=publications/jbe
        Journal of Microbiology and Biology Education, 2000–Present | American Society for Microbiology | Microbiology and biology teaching and learning, especially regarding student learning and other outcomes | Applied, theoretical | No | http://www.microbelibrary.org/submit/index.asp?bid=293/

    Science education

        Cultural Studies of Science Education, 2006–Present | Springer Netherlands | Science education as a cultural, cross-age, cross-class, and cross-disciplinary phenomenon, accompanied by interactive dialogue | Theoretical | Some | http://www.springerlink.com/content/120017/
        Electronic Journal of Science Education | Southwestern University | Science education and science teacher education in schools, colleges, and universities | Applied, theoretical | Yes | http://ejse.southwestern.edu/
        International Journal of Science Education | Routledge, Taylor & Francis Group | Applicable research related to science education practice in schools, colleges, and universities | Applied, theoretical | No | http://www.tandf.co.uk/journals/titles/09500693.asp
        Journal of College Science Teaching, 1971–Present | National Science Teachers Association | Curriculum and instruction in undergraduate and graduate science education | Practice, applied | No | http://www.nsta.org/college/
        Journal of Elementary Science Education, 1989–Present | Association for Science Teacher Education; College of Education and Human Services at Western Illinois University | Supervision, curriculum, and instruction in elementary science education | Applied, theoretical | Some | http://www.wiu.edu/users/jese/index.html/
        Journal of Research in Science Teaching | National Association for Research in Science Teaching; Wiley InterScience | Issues related to science teaching and learning and science education policy | Applied, theoretical | No | http://journals.wiley.com/0022-4308/
        Journal of Science Education and Technology, 1992–Present | Springer Netherlands | Science education across disciplines from the individual to the system level | Applied, theoretical | Some | http://www.springer.com/10956
        Journal of Science Teacher Education, 1990–Present | Association for Science Teacher Education | Preparation and in-service education of science teachers | Applied, theoretical | Some | http://theaste.org/
        Journal of STEM Education: Innovations and Research, 2000–Present | Laboratory of Innovative Technology and Engineering Education, Auburn University | Case studies and articles about innovations and research in science, technology, engineering, and mathematics education | Practice, applied | Yes | http://www.auburn.edu/research/litee/jstem/
        Research in Science Education, 1970–Present | Springer Netherlands | Science education in early childhood, school, college, university, workplace, and informal learning contexts | Applied, theoretical | Some | http://www.springerlink.com/content/108230/asp?id=108230/
        School Science and Mathematics, 1901–Present | Texas A&M University | Issues, concerns, and lessons within and between classroom science and mathematics | Practice, applied, theoretical | Some | http://ssmj.tamu.edu/
        Science and Children, 1963–Present | National Science Teachers Association | Curriculum and instruction in elementary science education | Practice, applied | No | http://www.nsta.org/elementaryschool/
        Science Education, 1916–Present | Wiley InterScience | Issues and trends occurring internationally in science curriculum, instruction, learning, policy, and teacher preparation | Applied, theoretical | No | http://www3.interscience.wiley.com/cgi-bin/jhome/32122
        Science Scope, 1977–Present | National Science Teachers Association | Curriculum and instruction in middle school science education | Practice, applied | No | http://www.nsta.org/middleschool/
        The Science Teacher, 1932–Present | National Science Teachers Association | Curriculum and instruction in high school science education | Practice, applied | No | http://www.nsta.org/highschool/

    This table includes information about journals in life science education and science education in general. Information is included about the journal's title, publisher, typical content, and website URL, as well as whether any or all of the journal's content is open access. Typical content includes articles related to teaching practice (i.e., how-to), applied and theoretical research on teaching and learning, and essays (e.g., commentaries on current issues, public policy).

    As in any field, education journals tend to specialize with regard to research questions and methodologies. Some journals feature descriptive essays, others theoretical research, some ethnographic studies, others statistical analyses, and so on. Articles describing teaching strategies and curricular innovations, described as “practice” publications in Table 1, are usually found in journals tailored to a specific teaching and learning audience (e.g., precollege, undergraduate, graduate, science center/museum). Articles in which theories are tested or developed, including those intended to demonstrate relationships between instructional approaches and learning outcomes or to understand the cognitive, social, and cultural underpinnings of teaching and learning, are generally found in journals of applied and theoretical research. Regardless of the question of interest, journals can be identified that have a mission to feature this kind of work or to reach an audience with similar interests (Table 1).

    Journals that feature the following kinds of work were not included in Table 1, but they may be of interest: general education, graduate and professional education, education administration and leadership, teaching and learning in other science disciplines (e.g., geoscience, chemistry, physics), informal and nonformal education (i.e., respectively, learning in unstructured settings such as science museums and learning in more structured but not classroom-based settings such as 4-H), evaluation, technological and applied science education (e.g., agricultural education), and educational psychology. In addition, several scientific journals have forums for publishing education articles (e.g., front matter in The Plant Cell, “Genetics Education” in Genetics, “Education Forum” in Science).

    TOOLS FOR INTERPRETING THE LITERATURE

    Even if an article of interest is successfully located, it may not be decipherable by nonexperts. Life scientists are familiar with the old adage that learning biology, which is replete with discipline-specific terminology, resembles learning a foreign language. Similarly, grappling with the education literature requires learning the vocabulary of both practice and scholarship. Several tools have been developed to assist nonexperts in learning the jargon, including online and print dictionaries and glossaries. The Lingo of Learning: 88 Terms Every Science Teacher Should Know (Colburn, 2003) and EdSpeak: A Glossary of Education Terms, Phrases, Buzzwords, and Jargon (Ravitch, 2007) define education practice terms and research vocabulary most relevant to schools and classrooms. EdSpeak also includes a handy list of acronyms for phrases, policymaking groups, funding agencies, accrediting bodies, and other relevant organizations. Other resources useful for interpreting evaluative, applied, and theoretical research include Fourth Generation Evaluation (Guba and Lincoln, 1989), Qualitative Research & Evaluation Methods (Patton, 2002), and Research Design: Qualitative, Quantitative, and Mixed Method Approaches (Creswell, 2003). Resources from psychology, sociology, and anthropology can also be useful (e.g., Hammond and Brandt, 2004). For example, the Online Dictionary of the Social Sciences (http://bitbucket.icaap.org/; Drislane and Parkinson, 2007), hosted by Canada's Athabasca University, has 1000 entries covering sociology and related disciplines. Explorations in Learning & Instruction: The Theory Into Practice Database (http://tip.psychology.org/; Kearsley, 1994–2007) includes descriptions of >50 theories relevant to human learning and instruction.

    Some life scientists may be well prepared to understand the complexities of the anthropological, psychological, and sociological underpinnings of teaching and learning. Ecologists and evolutionary biologists often study phenomena in which they are unable to control, predict, or even characterize all the variables involved. Similarly, education researchers are often not able to control all the factors at play in a learning situation, and they may not want to do so. Rather, some of their most valuable findings emerge from investigating the real contexts in which learning may occur.

    Like all research, investigations of teaching and learning begin with a question. Research questions generally fall into three categories (Shavelson and Towne, 2002, pp. 99–101): description (What is happening?), causation (When and with whom? Is it happening in a systematic or generalizable way?), and mechanism (Why is it happening?). In addition, learning behavior can be examined at different depths and with different time frames in mind, including changes in skills, knowledge, attitudes, or interests (short term), behavior and decisions (middle term), and life condition, status, or values (long term). Choices regarding methods of data collection and analysis are influenced by the outcomes that are of interest to the researcher, as illustrated in this fictional example:

    A researcher is interested in determining if and how high school students understand the dynamic interplay between gene expression and environmental stimuli. The researcher chooses to investigate this phenomenon in a class that is taught by a teacher who has a good understanding of the relevant concepts in genetics, physiology, and ecology and in a school that is geographically convenient, enabling multiple visits to the classroom. During the several weeks that students learn about these concepts, the researcher engages in substantive conversations with a few high school students within that class (documented by audiotape), observes relevant class-wide discussions (documented by videotape and/or a classroom observation protocol), collects student work, and interviews the teacher several times. The researcher and members of her research team analyze and interpret the entirety of the data to develop a rich picture of students' thinking. Before publishing her findings, the researcher shares the interpretations with the teacher to see if he thinks they have captured what the students understand.

    In this case, the researcher intended to document the learning of a limited group of students whose teacher may be well positioned to help them, rather than draw conclusions about how all students learn genetics or what students in general learn by using this curriculum. Her initial research question guided her choices regarding data collection, analyses, and interpretation, as well as the scope of her conclusions. Because she collected data by using several approaches, including discussions with students, she was able to ask them questions that made clear their understanding or lack thereof. Because she collected data over time rather than at just one or two time points (e.g., the beginning and end of the relevant units), she was able to develop hypotheses about what classroom occurrences may have altered students' conceptions. Finally, because she sought feedback from the teacher, who has a greater depth and breadth of experience working with these students, she has enhanced the credibility and trustworthiness of her interpretations.

    A researcher's perspective and theoretical framework also guide how and why he or she conducts studies. A life scientist's styles of reasoning and experimental practice (e.g., taking a biochemical or genetic approach to studying the cell cycle) are usually obvious from a quick reading of the methods in a paper or from knowledge about the journal where the work was published (e.g., Journal of Biological Chemistry vs. Genetics). As an instructive example, Bill Sullivan, a geneticist, and Doug Kellogg, a biochemist, both at University of California at Santa Cruz, have authored complementary stories illustrating how their perspectives differ (http://review.ucsc.edu/spring04/twoversions.html; Stephens, 2004). Approaching investigations from a genetic versus biochemical perspective influences the questions that are asked, the experimental tools that are used, the data that are collected, the analytical methods that are used, and the conclusions that are drawn, as well as the hypotheses and subsequent questions that are generated.

    Similarly, understanding the theoretical framework that guides an educational study can help readers identify the perspective of the researchers and anticipate the types of questions, data, analyses, and findings that will be included (Bodner and Orgill, 2007). For example, cognitive load theory rests on the premise that learning happens best in ways that are aligned with the organization of the brain and the nature of cognition, as understood from cognitive psychology and neuroscience research (Sweller, 1988; also see http://tip.psychology.org/sweller.html). On this view, the cognitive load of learners depends on their experience and expertise, which influences their short-term, long-term, and working memory capacities. Experts' knowledge is organized into schemas that facilitate learning, lowering the cognitive load required for learning and enabling them to process information with greater efficiency (Bransford et al., 1999). Novices have not developed such schemas; thus, they are more limited in the amount of information they can take in and incorporate using working memory. A study framed by cognitive load theory might consider how learning materials could be designed to minimize the amount of information provided to novice learners during the learning process, or to teach novices explicitly about expert schemas to help them organize their thinking during learning.

    TOOLS FOR EVALUATING THE LITERATURE

    The intent of most education research is to understand or explain social and psychological phenomena related to learning (Firestone, 1987). Researchers have developed models that are cognitive, behavioral, and social, with respective focuses on reasoning and memory, doing and action, and interacting and culture. Regardless of which model is used, research entails subjecting data, in whatever form, to systematic analysis. Yet, human thinking and relationships are extraordinarily complex phenomena that are not straightforward to analyze systematically. How such analysis manifests is again dependent on the research question (Anfara et al., 2002). In general, data should be sufficient, credible, and accurate to be considered evidence in support of a claim (Toulmin, 2003; Kelly and Yin, 2007). In other words, the data should be adequate to support the claim, and there should be a reasonable relationship between the data and the claims they support. The methods for data collection and analysis should be appropriate for supporting the claims, and so on. These points may seem obvious, but, when considering the literature as a whole, the data and methods can be strikingly varied depending on the questions being asked. Thus, a common rubric for evaluating the quality of studies in education must be considered at this broad level, rather than based on any particular methodological approach or type of data (Grossman and Mackenzie, 2005).

    In evaluating the methodology of an education study, the reader must take into account what research questions are addressed (Ercikan and Roth, 2006). For example, randomized controlled trials or investigations with well-matched comparison groups are well suited to investigating causal relationships between interventions and outcomes. Yet, in many cases, these study designs are not feasible (i.e., it is unrealistic to randomly assign students to classes) and they are costly (Olson, 2004; Grossman and Mackenzie, 2005). For experimental or quasi-experimental findings to have value, the instrument used for data collection (e.g., an exam, survey, or questionnaire) must be valid (i.e., it actually measures what it is purported to measure in the participating population) and reliable (i.e., the instrument would yield the same responses from the same individual if it is administered at different times). High-quality instruments must be informed by current theory and knowledge about teaching and learning (e.g., what are students' misconceptions about cellular respiration, and how can they be identified with the instrument?), and they must be validated by pilot testing within the population of interest and by conducting appropriate statistical analyses (e.g., confirmatory factor analysis; see Aikenhead and Ryan, 1992, as an example).
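    To give a concrete sense of what one small piece of such an analysis involves, the sketch below computes Cronbach's alpha, a widely used internal-consistency reliability statistic for multi-item instruments. This is offered only as an illustration of the general idea of reliability analysis; the statistic is not mentioned in the studies cited above, and the student responses are invented.

```python
from statistics import variance

def cronbach_alpha(responses):
    """Internal-consistency reliability for a respondents-by-items score matrix.

    alpha = k/(k-1) * (1 - sum of item variances / variance of total scores),
    where k is the number of items. Values near 1 suggest the items are
    measuring the same underlying construct consistently.
    """
    k = len(responses[0])                      # number of items
    items = list(zip(*responses))              # transpose: one tuple per item
    item_var = sum(variance(item) for item in items)
    total_var = variance([sum(row) for row in responses])
    return (k / (k - 1)) * (1 - item_var / total_var)

# Hypothetical Likert-scale data: 4 students x 3 related survey items.
scores = [
    [3, 4, 3],
    [4, 5, 4],
    [2, 3, 2],
    [5, 5, 5],
]
print(round(cronbach_alpha(scores), 2))  # prints 0.98 (items agree closely)
```

    A high alpha alone does not establish validity, which is why pilot testing within the population of interest remains essential: an instrument can be highly consistent while still measuring the wrong thing.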

    Insight gleaned about causal relationships between teaching strategies or curricular innovations and student and teacher outcomes may be applicable only to those individuals in that setting at that point in time. For such findings to be generalizable, credible evidence must be collected to demonstrate their applicability across populations and settings. If claims are being made about the transferability of findings to other students or teachers, the individuals in the new setting must resemble in some way the individuals in the original setting of the study. For example, findings from investigations in urban schools may not be applicable to rural schools, because urban schools have larger immigrant populations and more English language learners.

    Qualitative approaches provide opportunities to capture unintended outcomes, understand why certain outcomes occurred, and gain a deeper understanding of a phenomenon (Denzin and Lincoln, 2005, p. 5). Such research is intended to describe an experience and infer patterns about it or consider how it is representative of a broader set of experiences (Ercikan and Roth, 2006). Qualitative data tell a story by capturing and communicating someone else's experience, taking into account the perspectives, time, and situation of individuals involved, including the participants and even the researcher. The results can illuminate the actuality of teaching and learning in the real time and setting of a classroom (e.g., what is actually happening in this teaching and learning situation?). In addition, qualitative findings can serve as a proof of principle (e.g., is it possible to teach and learn in this way or using this curriculum?), a basis for generating new hypotheses (e.g., if these students learn in this way, do other students in other settings at other times learn in this same way or in other ways?), and a way to discover unanticipated outcomes (e.g., students did not seem to gain knowledge about cellular respiration, but they did expand their understanding of how scientific knowledge is generated).

    Scientists also use qualitative approaches and evidence in research (e.g., photographs to illustrate differences among cells or organisms, rich descriptions to explain the identification of a new species) and in training. A less obvious example is the oral preliminary exam that is the rite of passage to degree candidacy for all scientists-in-training. These exams are designed to ensure that the student is prepared to pursue an original line of inquiry, for example, by demonstrating awareness and understanding of relevant literature and methods, as well as some ability to interpret data, develop hypotheses, and design experiments to test them and rule out alternative explanations. Some aspects of exam content and structure are generalizable across the doctoral student population (e.g., all exams involve questioning by a group of faculty, all exams have a “grade” or outcome for the student). Yet, each exam is unique to the student, the student's research interests and completed course work, and the panel of faculty examiners. Faculty may start with certain questions in mind but may develop new questions or alter the direction of their questioning as the student articulates his or her understanding.

    The trouble with generalizations is that they don't apply to particulars.

    (Lincoln and Guba, 1985)

    The goal of the preliminary exam is to investigate in-depth the quality of one student's thinking by speaking with the individual and considering the context, not to generalize to other students. Yet, I expect that all students and faculty involved in preliminary exams intend for such experiences to yield trustworthy, dependable, and confirmable outcomes. The structure of the exam helps maximize the likelihood that this is the case. Preliminary exams involve multiple faculty asking many questions from different perspectives (i.e., triangulation of data sources and methods) over a length of time (i.e., prolonged engagement in the field; Anfara et al., 2002). Although preliminary exams are not research studies, they demonstrate how qualitative methods of data collection, analysis, and interpretation can be designed to maximize the sufficiency, credibility, and accuracy of the resulting data and the claims they support.

    CONCLUSION

    Many of the distinctions apparent between the cultures of science and education research resemble those seen between the cultures of science and education practice (Tanner et al., 2003). Yet, as Tanner and colleagues note, and as I hope my analogies illustrate, there is also common ground. The intention of this essay and the new CBE-LSE feature is to provide windows through which scientists can get a clearer and more comprehensive view of education scholarship in a way that can inform their teaching. I invite readers to suggest current articles of interest in life science education, as well as influential papers published in the more distant past or in the broader field of education research, to be featured in the new column. Please send any suggestions to .

    ACKNOWLEDGMENTS

    I thank the Science and Health Education Partnership at the University of California at San Francisco for instigating my transition from bench to classroom-based research and the Department of Biochemistry at Virginia Tech for facilitating it. I thank Julia Grady, Deborah Johnson, Susan Kirch, David Lally, Christine Luketic, Nancy Moreno, J. Kyle Roberts, Kathryn Sykes Smith, and Kimberly Tanner for many varied and thought-provoking discussions on what education research “looks like.” I also thank Louisa Stark and Rebecca Smith for helpful suggestions and Susan Kirch for careful reading and thoughtful feedback. The preparation of this publication was made possible by grant R25 RR08529 from the National Center for Research Resources (NCRR), a component of the National Institutes of Health (NIH). Its contents are solely the responsibility of the authors and do not necessarily represent the official views of NCRR or NIH.

    REFERENCES

  • Aikenhead G. S., Ryan A. (1992). The development of a new instrument: views on science-technology-society (VOSTS). Sci. Educ. 76, 477-491. http://www.usask.ca/education/people/aikenhead/#Articles (accessed 19 September 2007).
  • Anfara V. A., Brown K. M., Mangione T. L. (2002). Qualitative analysis on stage: making the research process more public. Educ. Res. 31, 28-38.
  • Bodner G. M., Orgill M. (2007). Theoretical Frameworks for Research in Chemistry/Science Education, Upper Saddle River, NJ: Prentice Hall.
  • Bower J. M. (1996). Scientists and science education reform: myths, methods, and madness. National Academies of Science Resources for Involving Scientists in Education. http://www.nas.edu/rise/backg2a.htm (accessed 11 May 2007).
  • Bransford J. D., Brown A. L., Cocking R. R. (1999). How People Learn: Brain, Mind, Experience, and School, Washington, DC: National Academies Press.
  • Busch A., Tanner K. D. (2006). Developing scientist educators: analysis of integrating K–12 pedagogy and partnership experiences into graduate science training. Paper presented at the National Association for Research in Science Teaching Annual Conference, April 3–6, 2006, San Francisco, CA.
  • Bush S. D., Pelaez N. J., Rudd J. A., Stevens M. T., Williams K. S., Allen D. E., Tanner K. D. (2006). On hiring science faculty with education specialties for your science (not education) department. CBE Life Sci. Educ. 5, 297-305.
  • Colburn A. (2003). The Lingo of Learning: 88 Terms Every Science Teacher Should Know, Arlington, VA: NSTA Press.
  • Creswell J. W. (2003). Research Design: Qualitative, Quantitative, and Mixed Method Approaches, 2nd ed., Thousand Oaks, CA: Sage Publications.
  • Davis S. H. (2007). Bridging the gap between research and practice: what's good, what's bad, and how can one be sure? Phi Delta Kappan 88, 568-578.
  • Denzin N. K., Lincoln Y. S., eds. (2005). The Sage Handbook of Qualitative Research, 3rd ed., London, United Kingdom: Sage Publications.
  • Dolan E. L., Soots B. E., Lemaux P. G., Rhee S. Y., Reiser L. (2004). Strategies for avoiding reinventing the precollege education and outreach wheel. Genetics 166, 1601-1609.
  • Drislane R., Parkinson G. (2007). Online Dictionary of the Social Sciences, Athabasca University. http://bitbucket.icaap.org/ (accessed 28 August 2007).
  • Ercikan K., Roth W.-M. (2006). What good is polarizing research into qualitative and quantitative? Educ. Res. 35, 14-23.
  • Feuer M., Towne L., Shavelson R. (2002). Scientific culture and educational research. Educ. Res. 31, 4-14.
  • Firestone W. (1987). Meaning in method: the rhetoric of quantitative and qualitative research. Educ. Res. 16, 16-21.
  • Grossman J., Mackenzie F. J. (2005). The randomized controlled trial: gold standard, or merely standard? Perspect. Biol. Med. 48, 516-534.
  • Guba E. G., Lincoln Y. S. (1989). Fourth Generation Evaluation, Newbury Park, CA: Sage Publications.
  • Hammond L., Brandt C. (2004). Science and cultural process: defining an anthropological approach to science education. Studies Sci. Educ. 40, 1-47. http://www.csus.edu/indiv/h/hammondl/anthscifinal4–19-04.htm (accessed 19 September 2007).
  • Kearsley G. (1994–2007). Explorations in Learning & Instruction: The Theory Into Practice Database. http://tip.psychology.org/ (accessed 16 June 2007).
  • Kelly A. E., Yin R. K. (2007). Strengthening structured abstracts for education research: the need for claim-based structured abstracts. Educ. Res. 36, 133-138.
  • Lincoln Y. S., Guba E. G. (1985). Naturalistic Inquiry, Beverly Hills, CA: Sage Publications.
  • McKeown R. (2003). Working with K–12 schools: insights for scientists. BioScience 53, 870-875.
  • Mosteller F., Nave B., Miech E. (2004). Why we need a structured abstract in education research. Educ. Res. 33, 29-34.
  • Olson D. R. (2004). The triumph of hope over experience in the search for “what works”: a response to Slavin. Educ. Res. 33, 24-26.
  • Patton M. Q. (2002). Qualitative Research & Evaluation Methods, 3rd ed., Thousand Oaks, CA: Sage Publications.
  • Ravitch D. (2007). EdSpeak: A Glossary of Education Terms, Phrases, Buzzwords, and Jargon, Alexandria, VA: Association for Supervision and Curriculum Development.
  • Schultz T. (1996). Science education through the eyes of a physicist. National Academies of Science Resources for Involving Scientists in Education. http://www.nas.edu/rise/backg2d.htm (accessed 11 May 2007).
  • Shavelson R. J., Towne L., eds. (2002). Scientific Research in Education, Washington, DC: National Academies Press.
  • Slavin R. (2002). Evidence-based education policies: transforming educational practice and research. Educ. Res. 31, 15-21.
  • Spillane S. A. (2004). Sharing strengths: educational partnerships that make a difference. Paper presented at the Annual Meeting of the American Educational Research Association, April 12–16, 2004, San Diego, CA.
  • St. Pierre E. A. (2006). Scientifically based research in education: epistemology and ethics. Adult Educ. Q. 56, 239-266.
  • Stephens T. (2004). The geneticist and the biochemist. University of California Santa Cruz Review, Spring Issue. http://review.ucsc.edu/spring04/twoversions.html (accessed 25 August 2007).
  • Sweller J. (1988). Cognitive load during problem solving: effects on learning. Cognit. Sci. 12, 257-285.
  • Tanner K. D. (2000). Evaluation of scientist-teacher partnerships: benefits to scientist participants. Paper presented at the National Association for Research in Science Teaching Annual Conference, April 30–May 3, 2000, New Orleans, LA.
  • Tanner K. D., Chatman L., Allen D. (2003). Approaches to biology teaching and learning: science teaching and learning across the school-university divide: cultivating conversations through scientist-teacher partnerships. Cell Biol. Educ. 2, 195-201.
  • Toulmin S. D. (2003). The Uses of Argument, updated ed., Cambridge, United Kingdom: Cambridge University Press.