
Recent Research in Science Teaching and Learning

    Published Online: https://doi.org/10.1187/cbe.10-03-0048

    This feature is designed to point CBE-Life Sciences Education (CBE-LSE) readers to current articles of interest in life sciences education as well as more general and noteworthy publications in education research. URLs are provided for the abstracts or full text of articles. For articles listed as “Abstract available,” full text may be accessible at the indicated URL for readers whose institutions subscribe to the corresponding journal. This themed issue focuses on recent studies on development and implementation of multiple-choice tests for assessing students' conceptions of science. As context for these studies and for further discussion of the literature on assessing students' conceptions and the use of concept inventories, please see the following:

    Klymkowsky, M. W., and Garvin-Doxas, K. (2008). Recognizing student misconceptions through Ed's Tools and the Biology Concept Inventory. PLoS Biol. 6, e3.

    [Full text available: www.plosbiology.org/article/info:doi/10.1371/journal.pbio.0060003]

    Smith, J. I., and Tanner, K. (2010). The problem of revealing how students think: concept inventories and beyond. CBE Life Sci. Educ. 9, 1–5.

    [Full text available: www.lifescied.org/cgi/content/full/9/1/1]

    Treagust, D. (2006). Diagnostic assessment in science as a means to improving teaching, learning and retention. UniServe Science Proceedings of the Assessment in Science Teaching and Learning Symposium, The University of Sydney, Sydney, Australia, 28 September 2006.

    [Full text available: www.usyd.edu.au/su/SCH/pubs/procs/2006/treagust.pdf]

    Caleon, I. S., and Subramaniam, R. (2009). Do students know what they know and what they don't know? Using a four-tier diagnostic test to assess the nature of students' alternative conceptions. Res. Sci. Educ. Online first (no volume or page numbers available yet).

    [Abstract available: www.springerlink.com/content/q13q3424u7717557]

    Caleon and Subramaniam describe the development and validation of a four-tier, multiple-choice test aimed at diagnosing high school students' content knowledge and explanatory knowledge regarding mechanical waves. The authors designed the instrument to address one of the “fundamental weaknesses” of multiple-choice questions: it is impossible to determine whether correct answers are informed by correct reasoning. The response options in two-tier diagnostic questions include alternative conceptions and thus can be useful for identifying the reasoning behind particular responses. Yet, two-tier questions suffer from a similar issue: it is impossible to distinguish incorrect responses that stem from insufficient knowledge from those that reflect alternative conceptions. The four-tier diagnostic questions in this study were designed to determine the nature and strength of students' alternative conceptions as well as students' confidence in their responses. Thus, each item (i.e., question) included four possible answers, at least four possible scientific reasons, and confidence ratings such that students could separately estimate how confident they were about their answers and their reasons.
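
    To make the four-tier structure concrete, the sketch below models one student response and one plausible way to separate confidently held alternative conceptions from simple lack of knowledge. This is not code from the paper: the field names, the 1-6 confidence scale, and the 3.5 cutoff are illustrative assumptions.

```python
# Illustrative sketch only; not from Caleon and Subramaniam's instrument.
from dataclasses import dataclass

@dataclass
class FourTierResponse:
    answer_correct: bool      # tier 1: the chosen answer
    answer_confidence: float  # tier 2: confidence in the answer
    reason_correct: bool      # tier 3: the chosen reason
    reason_confidence: float  # tier 4: confidence in the reason

CONFIDENT = 3.5  # hypothetical midpoint of an assumed 1-6 confidence scale

def categorize(r: FourTierResponse) -> str:
    """One plausible way to separate misconceptions from lack of knowledge."""
    if r.answer_correct and r.reason_correct:
        if min(r.answer_confidence, r.reason_confidence) > CONFIDENT:
            return "scientific conception, confidently held"
        return "correct but unsure"
    if r.reason_confidence > CONFIDENT:
        return "alternative conception, confidently held"
    return "lack of knowledge"  # incorrect and unsure

print(categorize(FourTierResponse(False, 5.0, False, 5.5)))
# -> alternative conception, confidently held
```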

    The authors used a series of propositional statements and concept maps to define the content boundaries of the test, and they identified alternative conceptions through review of the literature, observation during lessons, and examination of student work. Validation included review of content by physicists and physics teachers, review for readability, and pilot testing with a small sample of students who were interviewed to examine the validity of their test responses. Reliability was also examined by calculating Pearson correlations using a test/retest approach. Results from analysis of 598 students' responses revealed three trends. First, on some items, students were more confident about incorrect responses than correct ones, suggesting that they were unaware of their own lack of knowledge. Second, on some items, students expressed greater confidence about their answers than their reasons, indicating that they knew how to solve particular problems but could not explain the underlying scientific principles. Finally, on some items, students were more confident about their reasons than their answers, indicating that they were familiar with the underlying scientific principle but unclear on how to apply it to identify the correct answer. The authors argue that instructors can use results from four-tier diagnostic tests to tailor instruction toward developing students' understanding of underlying principles or demonstrating how to apply them, as well as to make students aware of the disconnect between what they know and what they think they know.
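
    The test/retest check described above reduces to correlating each student's score on the first administration with the same student's score on the retest. A minimal sketch follows; the scores are invented for demonstration and are not the study's data.

```python
# Minimal sketch of a test/retest reliability check with invented scores.
from scipy.stats import pearsonr

first_administration = [12, 15, 9, 18, 14, 11, 16, 13]
retest = [13, 14, 10, 17, 15, 10, 16, 12]

r, p = pearsonr(first_administration, retest)
print(f"test/retest reliability: r = {r:.2f} (p = {p:.3f})")
```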

    Chang, C.-Y., Yeh, T.-K., and Barufaldi, J. P. (2009). The positive and negative effects of science concept tests on student conceptual understanding. Int. J. Sci. Educ. 32, 265–282.

    [Abstract available: www.informaworld.com/smpp/content~db=all~content=a911582565]

    In this study, Chang and colleagues document and characterize the “testing effect”: the impact of taking tests, especially those with multiple-choice questions, on students' learning. Two mechanisms have been proposed to underlie the testing effect. First, students think about and select incorrect multiple-choice responses, thereby strengthening their beliefs about incorrect information. Second, the process of reading incorrect responses may enhance students' future retrieval of that information, a phenomenon called the “retention effect.” The authors developed three types of tests on the greenhouse effect and global warming: multiple-choice (MCT), with one correct response and one or two incorrect responses per item; correct-concept (CCT), with only correct responses that respondents marked as true or false; and incorrect-concept (ICT), with only incorrect responses that respondents marked as true or false. Thus, the question components of the items were the same for all three tests, but the response options for each item differed.
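
    As a concrete illustration of the shared-stem design, the sketch below generates the three formats from one pool of correct and incorrect statements. The item content is invented for demonstration and is not drawn from the authors' instrument.

```python
# Illustrative only: an invented item showing how the three test formats
# share a question stem but differ in their response options.
stem = "Which statements describe the greenhouse effect?"
correct = ["Atmospheric gases absorb and re-emit infrared radiation."]
incorrect = ["The hole in the ozone layer lets extra heat reach Earth.",
             "Greenhouse gases block sunlight from reaching the surface."]

mct = {"stem": stem, "options": correct + incorrect}  # pick the one correct option
cct = {"stem": stem, "judge_true_false": correct}     # only correct statements
ict = {"stem": stem, "judge_true_false": incorrect}   # only incorrect statements

for name, test in [("MCT", mct), ("CCT", cct), ("ICT", ict)]:
    print(name, test)
```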

    Two hundred and eight high school students completed the MCT, CCT, ICT, or no test (NT). Their conceptions regarding global warming and the greenhouse effect were documented using the “flow map method” before and after they completed the test. This method involves interviewing a student about his or her ideas related to a topic and then organizing the responses into a map that represents the main concepts and the sequence of ideas expressed by the student. Students in the MCT, CCT, and ICT groups reported significantly more concepts on the posttest flow map than on the pretest map, whereas students in the NT group reported equal numbers of concepts from pretest to posttest. In addition, the CCT group reported a significantly greater number of correct concepts on the posttest compared with the MCT, ICT, and NT groups, whereas the ICT group reported a significantly greater number of incorrect concepts on the posttest. The authors advocate avoiding overuse of false information on tests and using diverse testing strategies to minimize the negative consequences of testing per se on students' learning.
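
    The bookkeeping behind these pre/post comparisons can be pictured as tallying tagged concepts in each map. The sketch below uses a simplified representation of a flow map, with invented concepts and tags.

```python
# Rough sketch of the flow map tallies: each map is an ordered sequence
# of concepts a student expressed, tagged correct or incorrect by the
# researchers. All data below are invented.
pre_map = [("CO2 traps heat", True),
           ("the ozone hole causes warming", False)]
post_map = [("CO2 traps heat", True),
            ("greenhouse gases re-emit infrared radiation", True),
            ("the ozone hole causes warming", False)]

def tally(flow_map):
    correct = sum(1 for _, ok in flow_map if ok)
    return {"total": len(flow_map),
            "correct": correct,
            "incorrect": len(flow_map) - correct}

print("pretest: ", tally(pre_map))
print("posttest:", tally(post_map))
```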

    Tsui, C.-Y., and Treagust, D. (2009). Evaluating secondary students' scientific reasoning in genetics using a two-tier diagnostic instrument. Int. J. Sci. Educ. iFirst article (no volume or page numbers available yet).

    [Abstract available: www.informaworld.com/smpp/content~db=all~content=a912566870]

    Tsui and Treagust document the development and implementation of a two-tier multiple-choice instrument for diagnosing high school students' knowledge about genetics (tier 1) as well as their explanations of that knowledge (“reasoning,” tier 2). The authors designed the instrument to assess general logical reasoning, such as explanations of relationships between cause and effect, and domain-specific reasoning, such as genetic explanations within and across generations. Pilot versions of the instrument were administered to several 10th grade classes (n = 65). To enhance the validity of inferences made from the two-tier results, the authors included open-ended items in the instrument and interviewed students about their genetics conceptions and reasoning. The final version of the instrument was administered to 17 12th grade students before and after their completion of a genetics unit that involved extensive use of the software-based curriculum BioLogica (http://biologica.concord.org). Analysis of students' responses using a paired t test indicated that students' genetics reasoning improved after instruction. Small percentages of students selected correct responses without selecting appropriate explanations for those responses. This outcome suggests that these students either guessed correct responses or were unable to link knowledge with corresponding explanations; it further supports the argument for using a two-tier format for multiple-choice questions. Results also revealed that students were successful in using cause-to-effect (i.e., genotype to phenotype) and within-generation reasoning but that they struggled with effect-to-cause (i.e., phenotype to genotype) and across-generation reasoning. The authors recommend further study with a larger sample, because the small size and limited diversity of this sample restrict the inferences that can be made.
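
    A minimal sketch of this pre/post analysis follows, assuming invented per-student scores and a hypothetical scoring rule in which an item earns credit only when both tiers are correct; the authors' actual rubric may differ.

```python
# Minimal sketch of a paired pre/post comparison with invented scores.
# The scoring rule shown (credit only when both tiers are correct) is a
# hypothetical choice for illustration, not the authors' published rubric.
from scipy.stats import ttest_rel

def two_tier_score(responses):
    """responses: list of (answer_correct, reason_correct) pairs."""
    return sum(1 for a_ok, r_ok in responses if a_ok and r_ok)

# e.g., one student's pretest on three items earns one point:
print(two_tier_score([(True, True), (True, False), (False, False)]))  # -> 1

# Paired t test on per-student scores before and after the genetics unit:
pre_scores = [4, 6, 3, 5, 7, 4, 5, 6, 3, 5]
post_scores = [6, 7, 5, 6, 8, 6, 7, 7, 5, 6]
t, p = ttest_rel(post_scores, pre_scores)
print(f"paired t = {t:.2f}, p = {p:.4f}")
```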

    The following are recent CBE-LSE articles on concept inventories:

    Garvin-Doxas, K., and Klymkowsky, M. W. (2008). Understanding randomness and its impact on student learning: lessons learned from building the Biology Concept Inventory (BCI). CBE Life Sci. Educ. 7, 227–233.

    [Full text available: www.lifescied.org/cgi/content/abstract/7/2/227]

    Garvin-Doxas, K., Klymkowsky, M., and Elrod, S. (2007). Building, using, and maximizing the impact of concept inventories in the biological sciences: report on a National Science Foundation-sponsored conference on the construction of concept inventories in the biological sciences. CBE Life Sci. Educ. 6, 277–282.

    [Full text available: www.lifescied.org/cgi/content/abstract/6/4/277]

    Smith, M. K., Wood, W. B., and Knight, J. K. (2008). The Genetics Concept Assessment: a new concept inventory for gauging student understanding of genetics. CBE Life Sci. Educ. 7, 422–430.

    [Full text available: www.lifescied.org/cgi/content/abstract/7/4/422]

    I invite readers to suggest current themes or articles of interest in life science education as well as influential papers published in the more distant past or in the broader field of education research to be featured in Current Insights. Please send any suggestions to Erin Dolan ().