
Biology in Bloom: Implementing Bloom's Taxonomy to Enhance Student Learning in Biology

    Published Online: https://doi.org/10.1187/cbe.08-05-0024

    Abstract

    We developed the Blooming Biology Tool (BBT), an assessment tool based on Bloom's Taxonomy, to assist science faculty in better aligning their assessments with their teaching activities and to help students enhance their study skills and metacognition. The work presented here shows how assessment tools, such as the BBT, can be used to guide and enhance teaching and student learning in a discipline-specific manner in postsecondary education. The BBT was first designed and extensively tested for a study in which we ranked almost 600 science questions from college life science exams and standardized tests. The BBT was then implemented in three different collegiate settings. Implementation of the BBT helped us to adjust our teaching to better enhance our students' current mastery of the material, design questions at higher cognitive skill levels, and assist students in studying for college-level exams and in writing study questions at higher levels of Bloom's Taxonomy. From this work we also created a suite of complementary tools that can assist biology faculty in creating classroom materials and exams at the appropriate level of Bloom's Taxonomy and help students successfully develop and answer questions that require higher-order cognitive skills.

    INTRODUCTION

    Most faculty would agree that academic success should be measured not just in terms of what students can remember but also in terms of what they are able to do with their knowledge. It is commonly accepted that memorization and recall are lower-order cognitive skills (LOCS) that require only a minimum level of understanding, whereas the application of knowledge and critical thinking are higher-order cognitive skills (HOCS) that require deep conceptual understanding (Zoller, 1993). Students often have difficulty performing at these higher levels (Zoller, 1993; Bransford et al., 2000; Bailin, 2002). In the past decade, considerable effort has been directed toward developing students' critical-thinking skills by increasing student engagement in the learning process (Handelsman et al., 2004). An essential component of this reform is the development of reliable tools that reinforce and assess these new teaching strategies.

    Alignment of course activities and testing strategies with learning outcomes is critical to effective course design (Wiggins and McTighe, 1998; Sundberg, 2002; Ebert-May et al., 2003; Fink, 2003; Tanner and Allen, 2004; Bissell and Lemons, 2006). Students are motivated to perform well on examinations; therefore, the cognitive challenge of exam questions can strongly influence students' study strategies (Gardiner, 1994; Scouller, 1998). If classroom activities focus on concepts requiring HOCS but faculty test only on factual recall, students quickly learn that they do not need to put forth the effort to learn the material at a high level. Similarly, if faculty primarily discuss facts and details in class but test at a higher cognitive level, students often perform poorly on examinations because they have not been given enough practice developing a deep conceptual understanding of the material. Either case of misalignment between teaching and testing leads to considerable frustration for both instructor and student. Although considerable attention has been given to incorporating more active-learning strategies into our classrooms, far less attention has been paid to aligning assessment methods with learning goals. Indeed, one of the most significant ways to improve the quality of student learning is to improve our assessments (Entwistle and Entwistle, 1992).

    How can we better assess our assessment methods? One approach is to use Bloom's Taxonomy of cognitive domains (Bloom et al., 1956), hereafter referred to as "Bloom's." Bloom's is a well-defined and broadly accepted tool for categorizing types of thinking into six different levels: knowledge, comprehension, application, analysis, synthesis, and evaluation. A revised version of Bloom's (Anderson et al., 2001) further subcategorizes the original taxonomy and converts the category titles to their active verb counterparts: remember, understand, apply, analyze, evaluate, and create. Bloom's has been used widely since the 1960s in K-12 education (Kunen et al., 1981; Imrie, 1995) but has seen only limited application in selected disciplines in higher education (Demetrulias and McCubbin, 1982; Ball and Washburn, 2001; Taylor et al., 2002; Athanassiou et al., 2003).

    Although Bloom's lends itself to wide application, each discipline must define the original classifications within the context of its own field. In biology, Bloom's has been used to design rubrics for evaluating student performance on introductory biology exams (Bissell and Lemons, 2006), develop formative assessment questions at the appropriate cognitive level (Allen and Tanner, 2002), and inform course design (Allen and Tanner, 2007). Nonetheless, there is a significant need for more comprehensive assessment tools that undergraduate biology instructors can easily use to assess student learning, guide development of teaching strategies, and promote student metacognition in the biological sciences.

    We have developed the Blooming Biology Tool (BBT; Table 1), which can be used to assess the Bloom's Taxonomy level of questions on biology-related topics. The BBT evolved out of a study we were asked to participate in that required us to rank more than 600 biology exam questions from a wide variety of sources including MCAT, GRE, and AP biology exams, as well as introductory biology and first-year medical school courses (Zheng et al., 2008). Here we present a detailed description of the BBT and complementary materials for use by college and university faculty and students. We also highlight how we implemented the BBT and associated learning activities in a variety of educational settings. We found the BBT a useful guide for faculty in diagnosing students' aptitudes and creating new assignments to help students develop critical-thinking skills. Our students used the BBT to create more challenging study questions and self-identify the skill levels that they find the most demanding.

    Table 1. Blooming Biology Tool

    Knowledge1 (LOCS2)
    • Key skills assessed: IDENTIFY, RECALL, list, recognize, or label
    • General examples of biology exam questions: identify the parts of a eukaryotic cell; identify the correct definition of osmosis
    • Question types that can assess this level: labeling, fill-in-the-blank, true-false, multiple-choice, short answer, essay
    • Characteristics of multiple-choice questions: question only requires information recall; possible answers do not include significant distracters4

    Comprehension1 (LOCS2)
    • Key skills assessed: DESCRIBE or explain in your own words, re-tell, or summarize
    • General examples of biology exam questions: describe nuclear transport to a lay person; provide an example of a cell signaling pathway
    • Question types that can assess this level: labeling, fill-in-the-blank, true-false, multiple-choice, short answer, essay
    • Characteristics of multiple-choice questions: question requires understanding of concepts or terms; possible answers include significant distracters4

    Application1 (LOCS2/HOCS3)
    • Key skills assessed: PREDICT an outcome using several pieces of information or concepts; use information in a new context
    • General examples of biology exam questions: predict what happens to X if Y increases
    • Question types that can assess this level: labeling, fill-in-the-blank, true-false, multiple-choice, short answer, essay
    • Characteristics of multiple-choice questions: question requires prediction of the most likely outcome given a new situation or perturbation to the system

    Analysis (HOCS3)
    • Key skills assessed: INFER; understand how components relate to each other and to the process as a whole
    • General examples of biology exam questions: interpret data, graphs, or figures; make a diagnosis or analyze a case study; compare/contrast information
    • Question types that can assess this level: fill-in-the-blank, true-false, multiple-choice, short answer, essay
    • Characteristics of multiple-choice questions: question requires interpretation of data and selection of the best conclusion

    Synthesis (HOCS3)
    • Key skills assessed: CREATE something new using/combining disparate sources of information
    • General examples of biology exam questions: develop a hypothesis, design an experiment, create a model
    • Question types that can assess this level: short answer, essay
    • Characteristics of multiple-choice questions: N/A; if provided with choices, students only differentiate between possible answers rather than synthesize a novel response

    Evaluation (HOCS3)
    • Key skills assessed: DETERMINE/CRITIQUE relative value; determine merit
    • General examples of biology exam questions: critique an experimental design or a research proposal; appraise data in support of a hypothesis
    • Question types that can assess this level: multiple-choice, short answer, essay
    • Characteristics of multiple-choice questions: question requires assessment of information relative to its support of an argument

    1The first three levels of Bloom's are usually hierarchical; thus, to complete an analysis-level question, students must also demonstrate knowledge-, comprehension-, and application-level skills.

    2LOCS indicates lower-order cognitive skills.

    3HOCS indicates higher-order cognitive skills.

    4Significant distracters are those answers that represent common student misconceptions on that topic.

    DEVELOPMENT OF THE BLOOMING BIOLOGY TOOL

    In developing the BBT, we first established a basic rubric that drew extensively on previous interpretations of Bloom's as it relates to biology (Allen and Tanner, 2002; Ebert-May et al., 2003; Yuretich, 2003; Bissell and Lemons, 2006). Through research and discussion, we agreed that the first two levels of Bloom's (knowledge and comprehension) represent lower-order cognitive skills (Zoller, 1993). We considered the third level of Bloom's, application, to be a transition between LOCS and HOCS. The three remaining categories (analysis, synthesis, and evaluation) are true HOCS but are not necessarily hierarchical: a question categorized as evaluation does not always require analytical and synthesis abilities, but it usually does require mastery of the lower three levels (knowledge, comprehension, and application). While ranking questions, we found it helpful to check off each level of Bloom's required to successfully answer the question. For example, a question rated at the analysis level would require knowledge (facts), comprehension (understanding of facts), application (predicting outcomes), and analysis (inference). Each question was ranked at the highest level of Bloom's Taxonomy required for its solution.
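    To make the check-off procedure concrete, here is a minimal sketch in Python; the function name and the set-based input are our illustrative additions, not part of the published tool:

```python
# Minimal sketch of the "check-off" ranking described above.
# The ordered levels follow the original taxonomy (Bloom et al., 1956).
BLOOM_LEVELS = ["knowledge", "comprehension", "application",
                "analysis", "synthesis", "evaluation"]

def rank_question(levels_required):
    """Return the highest Bloom's level checked off for a question.

    levels_required: the set of levels a rater judged necessary to
    answer the question.
    """
    checked = [lvl for lvl in BLOOM_LEVELS if lvl in levels_required]
    return checked[-1] if checked else None

# An analysis-level question also requires the three levels below it:
print(rank_question({"knowledge", "comprehension",
                     "application", "analysis"}))  # -> "analysis"
```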

    The level of Bloom's that is assessed by a given type of exam question depends highly on what information is provided to the student and which inferences or connections the student must make on his or her own. It is equally important to consider the information previously provided through classroom instruction (i.e., if students are explicitly given the answer to an analysis question in class and then given that same question on an exam, the question only requires recall; Allen and Tanner, 2002). We would argue that labeling of diagrams, figures, etc., cannot assess higher than application-level thinking, as this question type, at most, requires students to apply their knowledge to a new situation. However, fill-in-the-blank, true-false, and multiple-choice questions can be designed to test analysis-level skills. It is nevertheless challenging to develop fill-in-the-blank questions that require higher than application-level thinking, but we have provided one such example (Supplemental Material A; Virology). Further, whereas multiple-choice questions can be designed to assess evaluation skills if they require students to determine relative value or merit (e.g., which data best support the following hypothesis), multiple-choice questions cannot assess synthesis-level thinking because all the answers are provided, eliminating the need for students to create new models, hypotheses, or experiments on their own. Many resources exist to assist faculty in designing high-quality multiple-choice questions (Demetrulias and McCubbin, 1982; Udovic, 1996; Brady, 2005), and we have provided a list of some of these resources (Supplemental Material B).
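    These format constraints can be summarized as a simple lookup. The sketch below (Python) encodes our reading of Table 1's grid; the mapping and function are illustrative, not part of the BBT itself:

```python
# Highest Bloom's level each question format can assess, per Table 1.
MAX_ASSESSABLE = {
    "labeling": "application",
    "fill-in-the-blank": "analysis",
    "true-false": "analysis",
    "multiple-choice": "evaluation",  # but cannot assess synthesis
    "short answer": "evaluation",
    "essay": "evaluation",
}

BLOOM_ORDER = ["knowledge", "comprehension", "application",
               "analysis", "synthesis", "evaluation"]

def format_can_assess(question_format, target_level):
    """True if the format can test the target level (Table 1's grid)."""
    if question_format == "multiple-choice" and target_level == "synthesis":
        return False  # answer choices preclude creating a novel response
    ceiling = MAX_ASSESSABLE[question_format]
    return BLOOM_ORDER.index(target_level) <= BLOOM_ORDER.index(ceiling)

print(format_can_assess("labeling", "analysis"))           # False
print(format_can_assess("multiple-choice", "evaluation"))  # True
```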

    To differentiate between Bloom's levels, we found it useful to take one particular topic (e.g., cell biology) and develop a series of increasingly challenging exam questions representing the various levels of Bloom's. In developing these multilevel questions, we considered what a student must know or be able to do in order to answer the question. For example, if the student needed to recall factual information and then be able to describe a process in his/her own words, we considered that question to test comprehension. We have provided examples for three different subdisciplines of biology: cell biology, physiology, and virology (Supplemental Material A). A similar approach was taken for the subdisciplines of ecology and evolution (Nehm and Reilly, 2007).

    We also found that science questions posed unique challenges to our rubric when they dealt with science-specific skills (e.g., graphing, reading phylogenetic trees, evaluating Punnett squares and pedigrees, and analyzing molecular biology data). To address this, we selected several of these science-specific skills and created examples or descriptions of question types that would assess mastery at each level (Table 2). Through this process and extensive discussion of our work, we were able to better define and categorize the different types of questions that are typically found on biology exams. To assist us in developing the rubric, we each independently ranked approximately 100 life science exam questions and then extensively discussed our analyses to reach consensus. The BBT reflects the progression of our insights into how to adapt a general assessment method to the discipline-specific skills inherent to biology. We subsequently independently analyzed another 500 questions; statistical analysis of our rankings based on the BBT revealed high interrater reliability (agreement of at least two of the three raters over 91% of the time; Zheng et al., 2008).

    Table 2. Examples and descriptions of science-specific skills at different levels of Bloom's Taxonomy

    Levels: Knowledge1 (LOCS2), Comprehension1 (LOCS2), Application1 (LOCS2/HOCS3), Analysis (HOCS3), Synthesis (HOCS3), Evaluation (HOCS3)

    Calculations
    • Knowledge: equation provided and variables identified ("plug and chug")
    • Comprehension: understand/define the components and variables of a given equation
    • Application: solve word problems by selecting the correct formula and identifying the appropriate variables
    • Analysis: solve a word problem and infer the biological significance or implication
    • Synthesis: create an equation that describes the relationship between variables
    • Evaluation: evaluate a computational solution to a problem, or assess the relative merit(s) of using a specific mathematical tool to solve a particular problem

    Concept maps
    • Knowledge: structure provided; student fills in the missing linking phrases or concepts, which are provided
    • Comprehension: structure provided with concepts filled in; student generates linking phrases to describe relationships
    • Application: student creates the structure; concepts and linking phrases provided
    • Analysis: student creates the structure; concepts are provided; student generates linking phrases to describe relationships and must link two different domains or maps together
    • Synthesis: student creates the structure and generates concepts and linking terms; map must be sufficiently complex
    • Evaluation: student evaluates existing concept maps based on established criteria/rubric

    Diagnoses
    • Knowledge: identify or list variables found in patient history, vital signs, and/or clinical test results; know which physiological problem each named disease represents (e.g., Graves' disease, hyperthyroidism)
    • Comprehension: define each variable; define the presenting signs and symptoms of each disease
    • Application: given a set of clinical variables, identify the relevant variables and make a diagnosis
    • Analysis: given a set of clinical variables and a diagnosis, determine which other possible diseases (differential diagnoses) need to be ruled out
    • Synthesis: given a set of clinical variables and a diagnosis, determine the next clinical test that needs to be performed to confirm the diagnosis
    • Evaluation: given a set of clinical variables and a diagnosis, evaluate the evidence supporting the diagnosis and provide the patient with a second opinion

    Graphing
    • Knowledge: identify the parts of graphs and recognize different types of graphs (e.g., identify the x-axis, identify a histogram)
    • Comprehension: describe the data represented in a simple graph
    • Application: draw a graph based on a given set of data; predict outcomes based on data presented in a graph
    • Analysis: read and interpret a complex graph having multiple variables or treatments, and explain the biological implications of the data
    • Synthesis: create a graphical representation of a given biological process or concept
    • Evaluation: assess the relative effectiveness of different graphical representations of the same data or biological concept

    Hardy-Weinberg analyses
    • Knowledge: given the Hardy-Weinberg (HW) equation, define the terms p², q², and 2pq; if given p + q = 1 and p = 0.7, calculate q
    • Comprehension: describe the assumptions of the HW equation and its use as a null hypothesis; state what 2pq represents in the HW equation (equation not given)
    • Application: determine the expected number of homozygous recessive individuals in a population if the recessive allele is represented in 30% of that population (HW equation not given)
    • Analysis: determine whether the following population is in HW equilibrium: 100 individuals, of which 37 are SS, 8 are ss, and 55 are Ss; defend your answer
    • Synthesis: create a new version of the HW equation that incorporates three alleles
    • Evaluation: analyze chi-square results to weigh predicted evolutionary flux

    Molecular techniques
    • Knowledge: identify what is being measured by a molecular technique (e.g., Northern analysis measures relative RNA levels in a given cell or tissue)
    • Comprehension: understand what the results of a molecular technique indicate (e.g., the intensity of a band on a Northern blot indicates the relative expression of a specific mRNA in the cell type or tissue from which the RNA was obtained)
    • Application: draw the expected results of a given molecular technique, or state which technique could be used to solve a novel problem (e.g., draw the banding pattern you would expect if you analyzed a protein complex containing a 55-kDa protein and a 35-kDa protein by SDS-PAGE)
    • Analysis: interpret the raw data obtained from a molecular technique, including interpreting controls and normalizing data (e.g., interpret the results of an RT-PCR gel by comparing the relative expression of experimental genes to a standardized control gene)
    • Synthesis: design an experiment using a given molecular technique to test a hypothesis (e.g., design an experiment using Northern analysis to test the hypothesis that transcription factor A regulates expression of gene B)
    • Evaluation: assess the relative merit of using two different molecular approaches to address a particular hypothesis (e.g., discuss the relative merits of using chromatin immunoprecipitation vs. electrophoretic mobility shift assay to test the hypothesis that a protein binds directly to the promoter of a particular gene)

    Phylogenetic trees/cladograms
    • Knowledge: given a cladogram, circle the root, nodes, or monophyletic groups
    • Comprehension: describe the relationship of sister taxa in a cladogram
    • Application: given four cladograms, identify which one is different and describe the evolutionary relationships that make it different
    • Analysis: given a set of taxa for which all but one in a pictured tree exhibit a synapomorphy, infer the evolutionary history of that taxon; with respect to the same synapomorphy, discuss your conclusions about the most recent common ancestor of the pictured taxa
    • Synthesis: given a variety of synapomorphies from different organisms, create a cladogram, identifying where the shared derived characteristics were acquired
    • Evaluation: given a case study showing that a group of organisms has different relationships depending on the type of data used to construct the tree, use new information provided to evaluate the collective data and infer the best-supported relationships of the organisms

    Punnett squares and pedigree analyses
    • Knowledge: given a Punnett square, identify the components (genotypes or phenotypes; parents or offspring) of a given genetic cross
    • Comprehension: given parental genotypes, make a Punnett square to show or describe the offspring's genotypes and phenotypes
    • Application: given parental genotypes in a word problem, identify the variables and make a Punnett square to determine the genotypic or phenotypic ratios of the offspring; information regarding dominance, sex linkage, crossing-over, etc. is provided
    • Analysis: given parental genotypes, make a Punnett square to show or describe the offspring's genotypes and phenotypes, and then solve a word problem with the new information; relationships regarding dominance, sex linkage, crossing-over, etc. must be inferred
    • Synthesis: use pedigree analysis to develop a hypothesis for how a certain disease is transmitted
    • Evaluation: weigh the relative value of different pieces of evidence (pedigree chart, incomplete transmission, linkage analysis, etc.) and determine the probability that an individual will develop a certain disease

    1The first three levels of Bloom's are usually hierarchical; thus, to complete an analysis-level question, students must also demonstrate knowledge-, comprehension-, and application-level skills.

    2LOCS indicates lower-order cognitive skills.

    3HOCS indicates higher-order cognitive skills.

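    As a worked instance of the Hardy-Weinberg analysis-level example above, the following minimal sketch (Python) checks the 100-individual population for HW equilibrium; the function and the 3.84 cutoff (the standard chi-square critical value for df = 1 at alpha = 0.05) are our illustrative additions, not part of the BBT:

```python
# Worked Hardy-Weinberg analysis-level example from Table 2:
# is a population of 37 SS, 55 Ss, and 8 ss individuals in HW equilibrium?
def hw_chi_square(n_AA, n_Aa, n_aa):
    """Chi-square goodness of fit against Hardy-Weinberg expectations."""
    n = n_AA + n_Aa + n_aa
    p = (2 * n_AA + n_Aa) / (2 * n)  # frequency of the dominant allele
    q = 1 - p
    expected = {"AA": p ** 2 * n, "Aa": 2 * p * q * n, "aa": q ** 2 * n}
    observed = {"AA": n_AA, "Aa": n_Aa, "aa": n_aa}
    chi2 = sum((observed[g] - expected[g]) ** 2 / expected[g]
               for g in observed)
    return p, q, chi2

p, q, chi2 = hw_chi_square(37, 55, 8)
print(f"p = {p:.3f}, q = {q:.3f}, chi-square = {chi2:.2f}")
# chi-square ~ 4.0 exceeds 3.84 (df = 1, alpha = 0.05), so the observed
# genotype counts deviate modestly but significantly from HW expectations.
```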

    The BBT is not meant to be an absolute or definitive rubric; rather, the BBT is meant to be used as a general guide to aid both faculty and students in developing and identifying biology-related questions representing the different levels of Bloom's. As with all assessment methods, we expect the BBT to continue to evolve through an iterative process. Continuous feedback from students and faculty using the tool will inform its evolution.

    DEVELOPMENT OF THE BLOOM'S-BASED LEARNING ACTIVITIES FOR STUDENTS

    The BBT can also be used by students to help them identify the Bloom's level of exam questions that pose the greatest academic challenge. However, once these challenging areas have been identified, students also need guidance on how to modify their study habits to better prepare themselves to answer those types of questions. We therefore created the Bloom's-based Learning Activities for Students (BLASt; Table 3), a complementary student-directed tool designed to specifically strengthen study skills at each level of Bloom's. We determined which study activities provided students with the type of practice that would lead to success at each Bloom's level. For example, the first two levels of Bloom's rely heavily on memorization skills that can be reinforced by an individual student using flash cards and mnemonics. However, the remaining levels of Bloom's that represent HOCS are more readily achieved through both individual and group activities. The BLASt incorporates a range of study methods and can be used by students to refine their study skills to become more efficient and effective learners.

    Table 3. Bloom's-based Learning Activities for Students (BLASt)1

    Knowledge (LOCS)

    Individual activities:
    • Practice labeling diagrams

    • List characteristics

    • Identify biological objects or components from flash cards

    • Quiz yourself with flash cards

    • Take a self-made quiz on vocabulary

    • Draw, classify, select, or match items

    • Write out the textbook definitions

    Group activities:

    • Check a drawing that another student labeled

    • Create lists of concepts and processes that your peers can match

    • Place flash cards in a bag and take turns selecting one for which you must define a term

    • Do the above activities and have peers check your answers

    Comprehension (LOCS)

    Individual activities:

    • Describe a biological process in your own words without copying it from a book or another source

    • Provide examples of a process

    • Write a sentence using the word

    Group activities:

    • Give examples of a process

    • Discuss content with peers

    • Take turns quizzing each other about definitions and have your peers check your answer

    Application (LOCS/HOCS)

    Individual activities:

    • Review each process you have learned and then ask yourself: What would happen if you increased or decreased a component in the system, or if you altered the activity of a component in the system?

    • If possible, graph a biological process and create scenarios that change the shape or slope of the graph

    Group activities:

    • Practice writing out answers to old exam questions on the board and have your peers check to make sure you don't have too much or too little information in your answer

    • Take turns teaching your peers a biological process while the group critiques the content

    Analysis (HOCS)

    Individual activities:

    • Analyze and interpret data in primary literature or a textbook without reading the author's interpretation, and then compare the author's interpretation with your own

    • Analyze a situation and then identify the assumptions and principles of the argument

    • Compare and contrast two ideas or concepts

    • Create a map of the main concepts by defining the relationships of the concepts using one- or two-way arrows

    Group activities:

    • Work together to analyze and interpret data in primary literature or a textbook without reading the author's interpretation and defend your analysis to your peers

    • Work together to identify all of the concepts in a paper or textbook chapter, create individual maps linking the concepts together with arrows and words that relate the concepts, and then grade each other's concept maps

    Synthesis (HOCS)

    Individual activities:

    • Generate a hypothesis or design an experiment based on information you are studying

    • Create a model based on a given data set

    • Create summary sheets that show how facts and concepts relate to each other

    • Create questions at each level of Bloom's Taxonomy as a practice test and then take the test

    Group activities:

    • Each student puts forward a hypothesis about a biological process and designs an experiment to test it. Peers critique the hypotheses and experiments

    • Create a new model/summary sheet/concept map that integrates each group member's ideas.

    Evaluation (HOCS)

    Individual activities:

    • Provide a written assessment of the strengths and weaknesses of your peers' work or understanding of a given concept based on previously determined criteria

    Group activities:

    • Provide a verbal assessment of the strengths and weaknesses of your peers' work or understanding of a given concept based on previously described criteria, and have your peers critique your assessment

    1Students can use the individual and/or group study activities described in this table to practice their ability to think at each level of Bloom's Taxonomy.

    IMPLEMENTATION OF THE BBT IN OUR CLASSROOMS

    We found that the very process of developing the BBT strongly influenced our own teaching in the classroom. The BBT guided us to ask and write better questions, develop more appropriate learning strategies, and assist our students in developing their metacognitive skills. This tool provided us with a means to consistently apply the principles of Bloom's to biology concepts and skills, thus allowing us to better assess student-learning outcomes.

    The following passages illustrate how we have applied the BBT at either a research-one institution or a liberal arts college in three different classroom contexts: (1) a small inquiry-based laboratory, (2) a large lecture, and (3) a medium-sized workshop setting. Table 4 presents the timelines of implementation of each teaching strategy. To facilitate a comparison of our different implementation strategies, we have compiled a chart outlining the strengths and challenges of each approach (Supplemental Material C).

    Table 4. Timelines for implementing the BBT in three different environments

    Faculty use of the BBT in an undergraduate cell biology laboratory course
    1st Quarter
    • Students read primary scientific literature on their topic of interest

    • Students formulate new hypotheses

    • Students design and perform pilot study to gather preliminary data

    • Students write a research proposal and receive written feedback on each draft

    Postquarter
    • Instructor designs a grading rubric to evaluate student performance on research proposal

    • Instructor uses BBT to classify Bloom's level needed to achieve success with each criterion in grading rubric

    • Instructor uses BLASt to develop new activities to help students master areas identified as weaknesses

    2nd Quarter
    • Students read primary scientific literature on their topic of interest

    • Students are introduced to grading rubric

    • Students peer-review previous quarter's research proposals

    • Students formulate new hypotheses

    • Students design and perform pilot study to gather preliminary data

    • Students write a research proposal

    Postquarter
    • Instructor uses grading rubric to evaluate student performance on research proposal

    Faculty and student use of the BBT in an undergraduate physiology course
    Day 1
    • Bloom's is introduced

    • Homework to develop a mnemonic for Bloom's is assigned

    Day 2
    • Discuss Bloom's mnemonics generated by students

    • Class is asked to “Bloom” the task of critiquing the mnemonics

    • Class is asked to “Bloom” the task of creating the mnemonic

    Each day
    • Students are asked to rank each question asked in class according to Bloom's before answering it

    Prior to exam
    • Students are given an old exam and told to Bloom each question and calculate the Bloom's distribution for the exam (i.e., what percentage of points was assigned to questions at the level of knowledge, comprehension, etc.). This helps students realize the cognitive challenge level of the upcoming exam

    Exam
    • Instructor uses BBT to Bloom the exam questions and produces a Bloom's distribution. This helps the instructor better align the challenge of the exam with course objectives

    Postexam
    • Students are shown the class average at each level of Bloom's

    • Bloom's rank of each question is included on the exam key

    • Students enter scores for each of their exam questions into an on-line survey

    • Instructor computes each student's Bloom's score and posts the score to the grade book

    • Students check their Bloom's score and view pertinent parts of BLASt

    Last day of class
    • Students are asked to submit a 1-2 paragraph response to the question "How has using Bloom's to analyze exam performance changed your learning strategies?"

    Student use of the BBT in biology workshops at a liberal arts college
    Week 1
    • Faculty gave formal lecture on Bloom's Taxonomy

    • Students practiced using Bloom's by ranking 45 biology and chemistry questions

    Week 2
    • Students worked in small groups to write questions about assigned primary literature papers; each group wrote 2 questions at each level of Bloom's for each paper (24 total)

    • Groups exchanged questions, ranked the questions, and answered 2 questions

    Week 3
    • Faculty generated 10 questions at different levels of Bloom's

    • Students worked in small groups to rank and answer questions

    Weeks 6–10
    • Each week a student group wrote 4 questions at each of the first 5 levels of Bloom's and submitted them to the faculty at the beginning of the week

    • Faculty selected the 10 best questions for workshop

    • Students worked in small groups to answer and rank questions using the BBT; the authors of the questions acted as peer tutors during the workshop

    Use of the BBT by a Faculty Member in a Laboratory Course at a Research-One Institution

    In a small, upper-division, inquiry-driven cell biology laboratory class (two sections of 11 students each) at a research-one institution, the BBT was used to evaluate student performance and redesign course activities to enhance student learning. The class was taught during consecutive quarters with a new cohort of students each quarter. The primary writing assignment in the course (worth 1/3 of the total grade) was a National Institutes of Health (NIH)-style research proposal. This was a challenging assignment for the students as none had written a research proposal before this course and most (>75%) had no previous research experience. Over the course of the quarter, groups of three or four students read primary scientific literature on their topic of interest, formulated new hypotheses, and designed and performed a pilot study to gather preliminary data in support of their hypotheses (see Table 4 for timeline). Each student then communicated his/her ideas and findings in the form of a written research proposal in which the student posed a hypothesis and described a set of specific aims (i.e., specific research objectives for the proposed study, as defined in NIH grant proposal guidelines) designed to further test this hypothesis. The assignment also required students to provide expected outcomes of their proposed experiments and discuss possible alternate outcomes and limitations inherent in their research design. The assignment was designed to teach students how to synthesize their own data with existing data from the literature and to build a strong argument in support of a new hypothesis. Students turned in one section of the proposal each week (e.g., Background and Significance) and received written feedback. Common difficulties were discussed with the class as a whole; however, neither the grading criteria nor the rubric were made explicit to the students.

    To facilitate evaluation of the students' research proposals, a grading rubric was developed (Walvoord and Anderson, 1998; Allen and Tanner, 2006). Students were scored from 1 to 4 for how well they fulfilled each of 12 criteria as well as for overall presentation (Table 5). Student performance was gauged both by looking at the percentage of students who earned full credit on a given criterion (Table 5) and also by determining the average percentage of possible points students earned for each criterion (data not shown). In reviewing these results, it appeared that certain criteria were much more challenging for students than other criteria. For example, whereas 41% of the students provided a well-thought-out and insightful discussion of their study's broader societal and scientific impact, <10% of the students were able to design specific aims that directly tested their hypothesis (Table 5). Others have assessed students' ability to write research proposals and identified similar areas of weakness (Kolikant et al., 2006).

    Table 5. Identification of students' writing weaknesses

    Research proposal grading criteria1, with the percent of students fulfilling each criterion2 and the highest level of Bloom's required3:

    Hypothesis and specific aims
    • Context (logical development of hypothesis): 50%; App/Anal
    • Hypothesis: 33%; Synth/Eval
    • Specific aims designed to test hypothesis: 9%; Synth/Eval

    Background & significance
    • Logical introduction of background relevant to topic: 50%; Know/Comp
    • Review of literature identifying gaps in knowledge: 27%; Synth/Eval
    • Broader societal and scientific significance of study: 41%; Know/Comp

    Preliminary data
    • Presentation of pilot study results: 28%; App/Anal
    • Interpretation and relevance of pilot study: 28%; App/Anal

    Research design
    • Overall design (appropriate methods, controls): 32%; App/Anal
    • Alternate outcomes for proposed study: 23%; App/Anal
    • Limitations of proposed approach: 9%; Synth/Eval
    • Methods: 32%; Know/Comp

    Presentation
    • Overall organization, grammar, style, figures: 14%; None4

    1Students' research proposals were evaluated according to 12 different criteria as well as overall presentation.

    2The percentage of students fulfilling each criterion was determined by dividing the number of students receiving a perfect score on a particular criterion by the total number of students in the class (n = 22).

    3The highest level of Bloom's cognitive domain required to successfully complete each criterion. Know/Comp indicates knowledge and comprehension; App/Anal, application and analysis; Synth/Eval, synthesis and evaluation.

    4Presentation was not assigned a Bloom's level.

    Subsequent to determining student proficiency in each area, the BBT was used to categorize each criterion based on the highest cognitive domain it demanded (Table 5). (Please note that the section entitled “broader societal and scientific significance” was ranked as knowledge/comprehension rather than application/analysis as the instructor had explicitly discussed the significance of this general area of research during lecture and students merely had to recall and focus the information for their specific study rather than apply knowledge to a new situation.) Not surprisingly, students performed best on criteria that required only a knowledge- or comprehension-level of thinking. Those criteria that demanded an ability to synthesize new ideas or critically evaluate a technique or body of knowledge proved to be the most challenging.

    After assessing student performance on the research proposal and identifying the criteria that students found the most challenging, the instructor designed new course activities that would provide students with an opportunity to practice skills needed to complete this complex research assignment (i.e., better scaffold the assignment). Two major changes were implemented when the course was taught the subsequent quarter. First, the assessment methods were made more transparent by introducing students to the grading rubric at the beginning of the quarter. Students were also provided with numerical feedback, in addition to written feedback, on each of their drafts indicating how well they had fulfilled each of the grading criteria (e.g., on their hypothesis and specific aims section they might receive 3 out of 4 for developing a clear testable hypothesis, but only 2 out of 4 for designing specific research objectives that tested this hypothesis). Second, as suggested by the BLASt, students evaluated their peers' research proposals from the previous quarter. This activity served three purposes: (1) to further familiarize students with the grading criteria that would be used to assess their own proposals, (2) to build students' confidence by placing them in the position of evaluator, and (3) to provide students with student-created models of research proposals that they could use to guide development of their own proposals.

    To assist students in applying the grading rubric to their peers' proposals, all students were asked to evaluate the same proposal from the previous quarter, and then a “norming session” was held in which the students received the instructor's ratings with further explanation as to why a particular numerical value had been assigned. Interestingly, students on average were harsher critics of their peers than the instructor in areas where they felt most confident (e.g., presentation style), whereas they awarded higher scores than the instructor in areas where they were less knowledgeable (e.g., research design). Students were then assigned a new set of three proposals that they evaluated individually. After reviewing the proposals, students convened in groups of four to act as a “review panel” to discuss the relative strengths and weaknesses of the three proposals and come to consensus on a rank order. These activities took a significant amount of class time, but ensured that students understood each of the criteria on which their own proposals would be scored at the end of the quarter.

    Comparison of research proposal scores between the second and first quarters revealed some interesting trends. Criteria requiring the most complex thinking skills showed the most dramatic improvement (Figure 1). For example, the second quarter's students earned an average of 80% of the total possible points for discussing inherent limitations to their research design, compared with only 61% in the previous quarter. Likewise, we observed a strong increase in students' ability to interpret their data and design their own hypotheses, skills that require the analysis and synthesis levels of Bloom's, respectively. Because these data were derived from two different populations of students (fall and winter quarters), the students' scores were analyzed according to their rank order using a nonparametric Kruskal-Wallis test, which does not assume that the two data sets possess a normal distribution. Based on this analysis, all three of the most dramatic increases were found to be statistically significant (Figure 1).
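    For readers who want to reproduce this style of analysis, the sketch below (Python with SciPy) compares one criterion's rubric scores across two quarters with a Kruskal-Wallis test; the score arrays are invented placeholders, not the course data:

```python
# Hypothetical per-student scores (1-4 rubric scale) on one criterion.
# These arrays are illustrative placeholders, not the published data.
from scipy.stats import kruskal

first_quarter = [2, 2, 3, 1, 2, 3, 2, 2, 3, 2, 1, 2]
second_quarter = [3, 3, 4, 2, 3, 3, 4, 2, 3, 3, 4, 3]

# Kruskal-Wallis is rank-based, so no normality assumption is needed.
statistic, p_value = kruskal(first_quarter, second_quarter)
print(f"H = {statistic:.2f}, p = {p_value:.4f}")

# Percentage change in the class average, as in Figure 1:
avg1 = sum(first_quarter) / len(first_quarter)
avg2 = sum(second_quarter) / len(second_quarter)
print(f"Change: {(avg2 - avg1) / avg1 * 100:.1f}%")
```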

    Figure 1.

    Figure 1. Increased student performance after implementation of grading rubric and peer-review panel. Student research proposals were evaluated based on 12 different criteria (1st quarter, n = 22; 2nd quarter, n = 24). The percentage increase in student performance was calculated as (average % in 2nd quarter − average % in 1st quarter)/(average % in 1st quarter) × 100. A negative number indicates a decrease in the average percentage students earned in the second quarter relative to the first quarter. Asterisks indicate statistically significant differences based on a nonparametric Kruskal-Wallis test. The average score earned on the research proposal increased from 76% to 82% in the second quarter.

    Students' scores on criteria requiring LOCS did not show statistically significant differences between the two quarters, indicating that the two groups of students were equivalently matched in terms of their basal knowledge of cell biology. This lack of increase in the areas of knowledge and comprehension also suggests that the newly incorporated activities primarily impacted students' HOCS. Students in the second quarter were less successful in describing experimental methods than their peers from the previous quarter; however, this is most likely attributable to the fact that students in the second quarter were asked to include methods that they were proposing to use (but had not used in the laboratory), whereas students in the first quarter were only required to include methods they had used to obtain their preliminary data (and with which they were therefore very familiar).

    The large increases in student performance on some of the most challenging aspects of the assignment occurred after implementation of class activities designed to enhance HOCS. However, the gains in student achievement could also be attributable to unrelated factors including quarter-to-quarter variation in student motivation or differences in faculty performance. Future research will focus on distinguishing between these different possibilities.

    As instructors, it is important that we recognize the complexity of the tasks we assign and prepare students appropriately for difficult tasks that require higher levels of thinking. As illustrated in this example, different sections of a research proposal require different cognitive skills. By recognizing which parts of an assignment are the most challenging, we can design specific activities or tools to help students succeed in those areas. Here, the instructor was able to use the BBT to identify the areas in which students struggled and to focus on improving learning in those areas. The grading criteria were explicitly discussed, and students were given structured opportunities to act as evaluators of other students' work. Sharing other students' work made it possible to illustrate more clearly what "success" with a given criterion would or would not look like. These types of activities, based loosely on the cognitive apprenticeship model (Collins et al., 1991), may help prepare students for challenging assignments (Felzien and Cooper, 2005; Kolikant et al., 2006).

    Faculty and Student Use of the BBT in an Undergraduate Physiology Course

    Bloom's Taxonomy of cognitive domains was introduced during the second class period of a large (120 students) upper-division undergraduate physiology course at a research-one university. Introduction of Bloom's took only 15 minutes and focused on helping students learn the taxonomy and realize the potential it offered for enhancing their learning. To reinforce the concept, students were assigned the homework task of developing their own mnemonic for the levels of Bloom's (see Table 4 for timeline). For the first 10 minutes of the next class, a representative sample of mnemonics was presented, and students were asked to identify the strengths and weaknesses of each mnemonic. Before soliciting responses, the students were queried as to which level of Bloom's was required to complete these two tasks (i.e., creating a mnemonic and identifying the strengths and weaknesses of a mnemonic). In future classes, this activity would be referred to as “Blooming” the question.

    Throughout the quarter, three to four questions on course content and concepts were asked during each class period, and the students were always asked to “Bloom” each question before answering it. “Blooming” in-class questions not only affords the students practice in using Bloom's with immediate feedback from the instructor but also allows the students to gain insight into which level of question they are having the most difficulty answering. This type of exercise strengthens student metacognition as it helps them monitor their mastery of the course concepts. Enhancing student metacognition has been found to be critical to student learning (Schraw, 1998; Bransford et al., 2000; Pintrich, 2002; D'Avanzo, 2003; Coutinho, 2007).

    Physiology is a challenging subject for students as it is based on a mechanistic and analytical rather than descriptive understanding of organismal processes (Modell, 2007). As such, the discipline requires students to work predominantly at the higher levels of Bloom's Taxonomy. Few students enter the course prepared to use the HOCS required to succeed on exams; therefore, it is necessary to raise awareness of the challenge level of the exam before the exam is given. To this end, students were given a homework assignment of first categorizing each question on the previous year's exam according to Bloom's and then calculating the number of points on the exam associated with each Bloom's level. This exercise helped students gain an appreciation for the Bloom's distribution of the exam questions and allowed them to adjust their studying accordingly.

    During the quarter the instructor used the BBT to categorize the Bloom's level of all exam questions. This allowed the instructor to compute a Bloom's distribution for each exam (e.g., 16% of points at the knowledge level, 38% at the comprehension level, and 46% at the application level), which in turn indicated the cognitive challenge of the exam. Calculating the Bloom's distribution allowed the instructor to determine whether the exam questions were indeed aligned with the course content and learning goals. Postexam, in addition to the routine analysis of test performance (range, means, SD), the instructor also showed how the class performed at each Bloom's level. It was not surprising to find that on the first exam students earned 80% of the knowledge points, 70% of the comprehension points, and only 55% of the application-level points.
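    A Bloom's distribution of this kind is straightforward to compute once each question has a point value and a level. The following sketch (Python) mirrors the calculation described above; the exam blueprint shown is invented for illustration:

```python
# Sketch: compute an exam's Bloom's distribution from its blueprint.
# The question list is a made-up example, not an actual course exam.
exam = [
    {"question": 1, "points": 8, "level": "knowledge"},
    {"question": 2, "points": 12, "level": "comprehension"},
    {"question": 3, "points": 10, "level": "application"},
    {"question": 4, "points": 10, "level": "application"},
    {"question": 5, "points": 7, "level": "comprehension"},
]

total = sum(q["points"] for q in exam)
distribution = {}
for q in exam:
    distribution[q["level"]] = distribution.get(q["level"], 0) + q["points"]

for level, points in distribution.items():
    print(f"{level}: {points / total * 100:.0f}% of points")
```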

    As the quarter progressed, the instructor recognized that it was important to provide students with their individual Bloom's scores. This was necessary because students frequently did not consider the class average to reflect their own performance, and though the Bloom's ranking of each exam question was included on the exam key, few students actually calculated their own Bloom's test score. Therefore, after the second exam was returned to the students, the students were instructed to enter their score for each exam question into an online data-collection tool. These data were then used to generate a Bloom's analysis of each student's test performance. The Bloom's test score is the percentage of points an individual student earns at each level of Bloom's (e.g., a student who earned 10 of the 20 points assigned to application-level questions would earn a 50% application score). Students accessed their Bloom's test scores through the grade-reporting portion of the course website. By this point in the quarter, the BLASt had been completed and made available to all students. However, students who earned <75% of the points at any Bloom's level were specifically directed to the appropriate learning activities of the BLASt and strongly encouraged to incorporate those activities into their study and learning strategies. As individual Bloom's scores were not reported and the BLASt was not available until midway through the second half of the class, significant improvement in student performance on the second midterm was not anticipated.
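    A minimal sketch of this per-student calculation follows (Python); the point totals, the example student, and the placement of the 75% referral threshold are illustrative assumptions, not the course's actual data:

```python
# Sketch: per-student Bloom's test scores with a BLASt referral cutoff.
# Point totals, the student record, and the threshold are illustrative.
POINTS_PER_LEVEL = {"knowledge": 20, "comprehension": 40, "application": 40}

def blooms_test_score(earned_by_level):
    """Percent of available points earned at each Bloom's level."""
    return {lvl: earned_by_level.get(lvl, 0) / pts * 100
            for lvl, pts in POINTS_PER_LEVEL.items()}

student = {"knowledge": 18, "comprehension": 30, "application": 10}
for level, pct in blooms_test_score(student).items():
    flag = "  -> see BLASt activities" if pct < 75 else ""
    print(f"{level}: {pct:.0f}%{flag}")
```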

    Research on human learning has found that developing students' ability to monitor their own learning (i.e., metacognition) is crucial to successful learning (Schraw, 1998; Bransford et al., 2000; Pintrich, 2002; D'Avanzo, 2003; Coutinho, 2007). "Blooming" in-class questions provides students with daily formative assessment of their learning, while the Bloom's analysis of test performance provides a more focused assessment of the types of questions with which each student struggles. Providing students with a Bloom's test score, in combination with recommendations for alternative learning methods from the BLASt, gives them a simple and straightforward means to monitor and change their learning strategies in biology. Unfortunately, by the time the students received their personalized Bloom's analysis of their second test performance, only two weeks remained in the 10-week quarter, and there was not enough time for students to make meaningful changes to their existing study habits. As a result, it was not possible to show significant changes to student learning over the course of the quarter. In future quarters, the personalized Bloom's analysis of test performance will be introduced at the start of the quarter, and greater emphasis will be placed on devising methods to help students learn how to implement study skills appropriate for the academic challenge of the course.

    After the quarter ended, students were asked what they thought about adding Bloom's to the course content. Below are two representative student responses:

    I think Bloom gives students an increased insight into the different types of learning and application of knowledge that students do for a class, it makes explicit something that is maybe only understood at a subconscious level. I think it gives students more tools and increases the control they have when they are studying.

    I remember initially thinking, “Why are we wasting valuable class time on Bloom's taxonomy?” I felt that Bloom's taxonomy was a burden, but I now use Bloom's taxonomy unconsciously to attack many problems. It is a method used to help organize my thoughts before I act.

    Student Use of the BBT in Biology Workshops at a Liberal Arts College

    Bloom's was used to promote pedagogical transparency and enhance students' abilities to design and answer questions in an upper-division interdisciplinary science program. Throughout the year-long program, students participated in weekly lectures, laboratories, seminars, and workshops cotaught by three different faculty who integrated topics in organic chemistry, biochemistry, cell biology, virology, and immunology. Workshops typically provided students with an opportunity to practice their problem-solving skills by answering faculty-generated questions in small groups.

    The BBT was implemented in the immunology workshops. Thirty-six students received formal training in using the BBT, and then worked collaboratively in the subsequent 10 wk of the quarter to develop questions representing all different levels of Bloom's for a variety of assigned readings (Table 4). Students were first formally introduced to Bloom's in a half-hour lecture during which the faculty used biology sample questions to exemplify the different levels. After the lecture, small groups used the BBT to rank 45 biology and 20 organic chemistry questions from GRE subject tests and faculty exams. The faculty provided assistance throughout the activity, and students were required to submit their ranked questions for credit. This process allowed students to practice using the BBT for evaluating the different levels at which questions can be written and helped them to engage in discussion about the type of questions presented.

    One wk after their initial training, students used the BBT to create questions from the content presented in eight primary literature papers that the students had previously read. Small groups of students were each assigned two papers for which they created two questions at each of the first five levels of Bloom's. The groups exchanged papers and associated questions, critiqued the level and design of the questions, and attempted to answer them. With faculty facilitation, each group presented their critique of and answer to one question to the entire class. The class then engaged in an open discussion about the material presented. These activities provided students with hands-on training for designing questions at different levels of Bloom's and set the stage for the remaining 8 wk of immunology workshops.

    During week three, the faculty generated 10 questions at different levels of Bloom's covering the assigned reading in an immunology textbook. In their scheduled workshop time, students met in small groups to discuss and answer the questions. For homework, students were required to individually answer and rank the questions according to Bloom's. Students received credit both for their answers to the questions and for their completion of the Bloom's rankings.

    During the last 5 wk of the program, students were responsible for generating and answering their own questions based on the assigned reading. Groups of five to seven students were responsible for writing a total of 20 weekly questions corresponding to the chapter being presented in lecture. Each week, a group generated four questions at each of the first five levels of Bloom's. The night before the workshop, the questions were sent to the faculty, who selected the best questions and arranged them in random order with respect to Bloom's ranking; the designated rankings were excluded from the final handout. In the workshop, the authors of the questions served as peer teaching assistants while the other students worked to answer and rank the questions. The authors were instructed to withhold the Bloom's rankings from the other students and to assist them only in finding the appropriate textbook material for answering the questions. Students were required to individually type up their answers and rank the questions according to Bloom's. These weekly assignments were turned in to the faculty for grading, but students were graded only on their responses to the assigned questions and on completing the Bloom's rankings. Although exams and homework assignments given at The Evergreen State College are graded and scored, the college does not give cumulative numerical grades but rather narrative evaluations of a student's course work. This pedagogical philosophy enhances learning communities and provides an environment for effective group work. Students were held accountable for their participation in workshops through the grading of their individual responses to the questions.

    The goals of the course activities were to teach students about Bloom's and let them practice using the BBT to rank and write good questions at different levels so that they could independently assess the level of their understanding of biology content in the future. Based on a show of hands in class, only one student had heard of Bloom's but did not feel as though they understood it enough to use it. While students were first practicing ranking questions, the instructor formatively assessed their knowledge of Bloom's and confirmed that none of the students in the course had any experience using it. However, by the end of the course, the students were very consistent in their independent ranking of the questions according to Bloom's. For 31 of the 51 questions, greater than 80% of the students agreed on the Bloom's ranking (Figure 2). This indicates that students who are trained to use the BBT are capable of writing and identifying questions at different levels of Bloom's. Students can apply this knowledge to their studying practices, evaluating the levels at which they understand concepts and adjusting their study skills to reach higher levels of Bloom's. These findings were highlighted by students in their final written evaluations of the program; some indicated that these exercises also helped them develop better questions about material they were learning in other areas of the program. The following are evaluation responses related to the use of Bloom's in the program:

    Designing challenging questions proved to be often more difficult than answering them. Studying via question design is a skill that I will apply to new material in the future.

    A huge part of this course was learning how to use Bloom's Taxonomy which is a ranking system for formal questions. Throughout the quarter groups were required to write questions as well as answer questions based on this ranking system. Learning Bloom's Taxonomy showed me how much effort goes into designing an exam or a homework assignment. I find myself wanting more.

    All year long I engaged my peers in workshop and problem set collaboration, and while I always learn a significant amount in that setting, I was not comfortable with being led through a quarter's worth of assignments by students that knew less than me. However, I must add that [the faculty's] desire to instruct students in the art of thinking like a teacher and asking questions on many different levels of understanding was beneficial.

    Learning the different levels of questions really helped me to take tests better and increased my capacity of grasping concepts.


    Figure 2. Instruction on Bloom's helps students agree on rankings. Thirty-four students ranked five sets of immunology questions written by their peers in the class, for a total of 51 questions. For each question, the percentage of students who agreed on a particular ranking was determined, and the number of questions at each level of agreement is reported here. For all but one of the questions, >50% of the students agreed on the same ranking.
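    As a concrete reading of this tally, the short sketch below (our reconstruction with toy data; the rankings dictionary and the percent_agreement helper are hypothetical, not the authors' analysis code) computes, for each question, the percentage of students who chose the most common ranking and then counts how many questions exceed the 80% agreement threshold reported above.

```python
from collections import Counter

# Hypothetical reconstruction of the Figure 2 calculation (not the authors'
# code): per question, find the percentage of the 34 students who agreed on
# the modal Bloom's ranking, then tally questions above 80% agreement.

# Toy data: rankings[q] lists each student's Bloom's ranking for question q.
rankings = {
    "Q1": ["application"] * 30 + ["analysis"] * 4,
    "Q2": ["knowledge"] * 18 + ["comprehension"] * 16,
}

def percent_agreement(student_rankings):
    """Percentage of students who chose the modal ranking for a question."""
    modal_count = Counter(student_rankings).most_common(1)[0][1]
    return 100 * modal_count / len(student_rankings)

agreement = {q: percent_agreement(r) for q, r in rankings.items()}
print(agreement)  # e.g., {'Q1': 88.2..., 'Q2': 52.9...}
print(sum(pct > 80 for pct in agreement.values()), "of", len(rankings),
      "questions exceed 80% agreement")
```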

    Collectively, this suggests that formal training of students to use the BBT in ranking science questions, followed by substantive practice at writing and ranking questions at different levels of Bloom's Taxonomy, enhances their study skills and metacognitive development.

    IMPLICATIONS FOR UNDERGRADUATE BIOLOGY EDUCATION

    Assessment is the process of evaluating evidence of student learning with respect to specific learning goals. Assessment methods have been shown to greatly influence students' study habits (Entwistle and Entwistle, 1992). We agree with other educators who have argued that in the process of constructing a course, assessment is second only to establishing course learning goals for guiding course design (Wiggins and McTighe, 1998; Palomba and Banta, 1999; Pellegrino et al., 2001; Fink, 2003). Though many faculty establish learning goals for their courses, they often struggle with how to evaluate whether their formative and summative assessment methods truly gauge student success in achieving those goals.

    Most faculty would agree that we should teach and test students for higher-order cognitive skills. However, when faculty are given training in how to use Bloom's and practice ranking their own exam questions, they often realize that the majority of their test questions sit at the lower levels of Bloom's. For example, at a national meeting for undergraduate biology education, 97% of the faculty (n = 37) who attended a formal lecture on using Bloom's to rank exam questions agreed that only 25% of their exam questions tested for higher-order cognitive skills (unpublished data). Therefore, much of the time we may not be testing students on, or giving them enough practice at, using content and science process skills at higher cognitive levels, even though our goal is that they master the material at all levels. One explanation for this discrepancy may be that biology faculty have not been given the tools and guidelines that would help them better align their teaching with their assessments of student learning. To further emphasize this point, an analysis of exam questions from medical school courses, which should be aimed at developing HOCS (Whitcomb, 2006), found that these questions instead predominantly test at lower cognitive levels (Zheng et al., 2008).

    Developing strong assessment methods is a challenging task, and limited resources have been allocated to support faculty in this endeavor. Further, because of the current trend of increasing class sizes and decreasing teaching assistant support, multiple-choice exams are becoming the most practical assessment method. It is therefore increasingly important for faculty to invest the time necessary to create multiple-choice exam questions that test at the higher levels of Bloom's (Brady, 2005), as well as to develop integrative testing approaches, such as requiring students to justify their answers to a small subset of multiple-choice questions (Udovic et al., 1996; Montepare, 2005). However, to accurately gauge student performance, we strongly encourage faculty to include on their exams short-essay questions or other question types that test HOCS. This shift in assessment practice may require additional teaching support from departments and administrations, but we believe it is very important to the cognitive development of our students.

    Our aim in developing the BBT was to make an assessment tool for use by biology faculty and students alike. To further facilitate this process, we have created a diverse array of biology-focused examples, covering both specific skills (e.g., graphing) and subdiscipline content (e.g., physiology) that biology students typically encounter. These examples, in conjunction with the BBT, are designed to aid biologists in characterizing questions according to their relative cognitive challenge and, therefore, in developing assessment methods that are more closely aligned with an instructor's learning goals. The BBT can also be used in conjunction with BLASt to help students self-diagnose their learning challenges and develop new strategies to strengthen their critical-thinking skills.

    Our implementation of the BBT enhanced teaching and learning in a wide variety of instructional environments. Using the BBT, we were able to identify the cognitive levels of the learning activities with which students struggle most and to adjust our teaching practices accordingly. The BBT also helped us to create pedagogical transparency and enhance student metacognition. As always, there is a trade-off when class time is used to develop metacognitive skills rather than focusing exclusively on course content. However, in our student-centered implementations of the BBT, Bloom's Taxonomy was fully integrated into the course subject matter (e.g., designing exam questions at different levels of Bloom's), and anecdotal evidence from our students suggests that they continue to use Bloom's to guide their learning strategies in subsequent classes. Given our experience and the well-documented importance of metacognition in student learning in all disciplines, including science (Schraw, 1998; Bransford et al., 2000; Pintrich, 2002; D'Avanzo, 2003; Coutinho, 2007), we consider the potential benefits students gain from learning Bloom's to far outweigh the cost of a minimal reduction in course content.

    We envision that the BBT could help faculty create biology questions at appropriate cognitive levels and thereby provide them with a means to (1) assess students' mastery of both biological content and skills and (2) better align their assessments with their learning objectives. We believe that use of the BBT by both faculty and students will help students achieve a deeper understanding of the concepts and skills required to become successful biologists. On a broader scale, the BBT could aid in the development of biology assessment tools that could then be used to examine levels of academic challenge across different types of standardized exams in the life sciences and to facilitate departmental and interinstitutional comparisons of college biology courses.

    ACKNOWLEDGMENTS

    We thank J. Dorman, S. Freeman, and M. Withers for critical review of the manuscript. We owe particular thanks to S. Freeman and his two undergraduate coauthors (Zheng et al., 2008) for providing us with the inspiration and the encouragement needed to pursue this work. A.J.C. would like to thank T.S. Gross for help with statistical analysis.

    REFERENCES

  • Allen D., Tanner K. (2002). Approaches to cell biology teaching: questions about questions. Cell Biol. Educ. 1, 63-67.
  • Allen D., Tanner K. (2006). Rubrics: tools for making learning goals and evaluation criteria explicit for both teachers and learners. CBE Life Sci. Educ. 5, 197-203.
  • Allen D., Tanner K. (2007). Putting the horse back in front of the cart: using visions and decisions about high-quality learning experiences to drive course design. CBE Life Sci. Educ. 6, 85-89.
  • Anderson L. W., Krathwohl D. R., Bloom B. S. (2001). A Taxonomy for Learning, Teaching, and Assessing: A Revision of Bloom's Taxonomy of Educational Objectives, New York, NY: Longman.
  • Athanassiou N., McNett J. M., Harvey C. (2003). Critical thinking in the management classroom: Bloom's taxonomy as a learning tool. J. Manag. Educ. 27, 533-555.
  • Bailin S. (2002). Critical thinking and science education. Sci. Educ. 11, 361-375.
  • Ball A., Washburn S. (2001). Teaching students to think: practical applications of Bloom's taxonomy. Agr. Educ. Mag. 74, 16-17.
  • Bissell A. N., Lemons P. P. (2006). A new method for assessing critical thinking in the classroom. BioScience 56, 66-72.
  • Bloom B. S., Krathwohl D. R., Masia B. B. (1956). Taxonomy of Educational Objectives: The Classification of Educational Goals, New York, NY: D. McKay.
  • Brady A. (2005). Assessment of learning with multiple-choice questions. Nurse Educ. Pract. 5, 238-242.
  • Bransford J., Brown A. L., Cocking R. (2000). How People Learn: Brain, Mind, Experience, and School, Washington, DC: National Academies Press.
  • Collins A., Brown J. S., Holum A. (1991). Cognitive apprenticeship: making thinking visible. Amer. Educator, 6-46.
  • Coutinho S. A. (2007). The relationship between goals, metacognition, and academic success. Educate 7, 39-47.
  • D'Avanzo C. (2003). Application of research on learning to college teaching: ecological examples. BioScience 53, 1121-1128.
  • Demetrulias D. A., McCubbin L. E. (1982). Constructing test questions for higher level thinking. Nurse Educator 7, 13-17.
  • Ebert-May D., Batzli J., Lim H. (2003). Disciplinary research strategies for assessment of learning. BioScience 53, 1221-1228.
  • Entwistle A., Entwistle N. (1992). Experiences of understanding in revising for degree examinations. Learn. Instruct. 2, 1-22.
  • Felzien L., Cooper J. (2005). Modeling the research process: alternative approaches to teaching undergraduates. J. College Sci. Teach. 34, 42-46.
  • Fink L. D. (2003). Creating Significant Learning Experiences: An Integrated Approach to Designing College Courses, San Francisco, CA: Jossey-Bass.
  • Gardiner L. F. (1994). Redesigning Higher Education: Producing Dramatic Gains in Student Learning, Washington, DC: George Washington University.
  • Handelsman J., et al. (2004). Scientific teaching. Science 304, 521-522.
  • Imrie B. W. (1995). Assessment for learning: quality and taxonomies. Assess. Eval. Higher Educ. 20, 175-189.
  • Kolikant Y. B.-D., Gatchell D. W., Hirsch P. L., Linsenmeier R. A. (2006). A cognitive-apprenticeship-inspired instructional approach for teaching scientific reading and writing. J. College Sci. Teach. 36, 20-25.
  • Kunen S., Cohen R., Solman R. (1981). A levels-of-processing analysis of Bloom's Taxonomy. J. Educ. Psych. 73, 202-211.
  • Modell H. I. (2007). Helping students make sense of physiological mechanisms: the "view from the inside." Advan. Physiol. Educ. 31, 186-192.
  • Montepare J. (2005). A self-correcting approach to multiple choice tests. Observer 18.
  • Nehm R., Reilly L. (2007). Biology majors' knowledge and misconceptions of natural selection. BioScience 57, 263-272.
  • Palomba C. A., Banta T. W. (1999). Assessment Essentials: Planning, Implementing, and Improving Assessment in Higher Education, San Francisco, CA: Jossey-Bass.
  • Pellegrino J. W., Chudowsky N., Glaser R. (2001). Knowing What Students Know: The Science and Design of Educational Assessment, Washington, DC: National Academies Press.
  • Pintrich P. (2002). The role of metacognitive knowledge in learning, teaching, and assessing. Theory into Pract. 41, 219-226.
  • Schraw G. (1998). Promoting general metacognitive awareness. Instruc. Sci. 26, 113-125.
  • Scouller K. (1998). The influence of assessment method on students' learning approaches: multiple choice question examination versus assignment essay. Higher Educ. 35, 453-472.
  • Sundberg M. D. (2002). Assessing student learning. Cell Biol. Educ. 1, 11-15.
  • Tanner K., Allen D. (2004). Approaches to biology teaching and learning: from assays to assessments. Cell Biol. Educ. 3, 69-74.
  • Taylor D. S., Goles T., Chin W. W. (2002). Measuring student learning. e-Service Journal 1, 41-51.
  • Udovic D., Morris D., Dickman A., Postlethwait J., Wetherwax P. (1996). The Workshop Biology Curriculum Handbook, Eugene, OR: University of Oregon.
  • Walvoord B.E.F., Anderson V. J. (1998). Effective Grading: A Tool for Learning and Assessment, San Francisco, CA: Jossey-Bass.
  • Whitcomb M. E. (2006). The teaching of basic sciences in medical schools. Acad. Med. 81, 413-414.
  • Wiggins G. P., McTighe J. (1998). Understanding by Design, Alexandria, VA: Association for Supervision and Curriculum Development.
  • Yuretich R. F. (2003). Encouraging critical thinking. J. College Sci. Teach. 33, 40-45.
  • Zheng A. Y., Lawhorn J. K., Lumley T., Freeman S. (2008). Assessment: application of Bloom's Taxonomy debunks the "MCAT Myth." Science 319, 414-415.
  • Zoller U. (1993). Are lecture and learning compatible? Maybe for LOCS: unlikely for HOCS (SYM). J. Chem. Educ. 70, 195-197.