
Published Online: https://doi.org/10.1187/cbe.02-07-0021

Questions! Questions! Questions! When a teacher is teaching students of any age, on any topic, questions are the teacher's best friend. As a teacher, do you ask questions of your students? When do you ask questions? Are they oral questions or written questions? For what purposes do you ask questions? Do you write out in advance the questions you ask? What kinds of questions do you tend to ask? What kinds of answers do you tend to get? What do you predict would happen in your classroom if you changed the kinds of questions that you ask? How could you collect data on and analyze your questioning patterns and the impact of different kinds of questions on your students' learning? What criteria could you use to assess the effectiveness of your questions?

There are many questions to be asked about the pedagogical practice of questioning. Questions provide insight into what students at any age or grade level already know about a topic, which provides a beginning point for teaching. Questions reveal misconceptions and misunderstandings that must be addressed for teachers to move student thinking forward. In a classroom discussion or debate, questions can influence behaviors, attitudes, and appreciations. They can be used to curb talkative students or draw reserved students into the discussion, to move ideas from the abstract to the concrete, to acknowledge good points made previously, or to elicit a summary or provide closure. Questions challenge students' thinking, which leads them to insights and discoveries of their own. Most important, questions are a key tool in assessing student learning. When practiced artfully, questioning can play a central role in the development of students' intellectual abilities; questions can guide thinking as well as test for it.

Although many teachers carefully plan the test questions used as final assessments of students' mastery of course material, much less time is invested in the oral questions that are interwoven into our teaching. Analysis of the kinds of questions we ask, whether oral or written, and of the nature of the answers they elicit is even rarer. Given the important role of questions in teaching and learning, a method for collecting evidence about our own questioning strategies and a framework within which to analyze them have the potential to transform our teaching. Such a framework can be found in Bloom's (1956) Taxonomy of the Cognitive Domain, a classification system for cognitive abilities and educational objectives developed by educational psychologist Benjamin Bloom and his four colleagues (M. Englehart, E. Furst, W. Hill, and D. Krathwohl). Since its inception, Bloom's Taxonomy has influenced curriculum development, the construction of test questions, and our understanding of learning outcomes (Kunen et al., 1981; Kottke and Schuster, 1990). It has helped educators to match the questions they ask with the types of thinking skills they are trying to develop, and to otherwise formulate or clarify their instructional objectives.

Bloom's Taxonomy is based on the premise that there are distinct thinking behaviors that are important in the process of learning. Bloom and colleagues grouped these behaviors into six categories that ascend in their level of complexity: from knowledge, comprehension, and application at the lower levels to analysis, synthesis, and evaluation at the higher levels. This scheme orders the six categories into a hierarchy such that cognition at each level encompasses, builds on, and is more difficult than that at the levels below it. In turn, these categories provide a framework for classifying questions that prompt students to engage in these different thinking behaviors, and thus a tool for reflecting on the questioning strategies we use in our teaching.

The utility of Bloom's Taxonomy in helping to distinguish the cognitive level needed to answer a given question becomes clearer when the categories in the hierarchy are more fully described. These descriptions (a composite of descriptions found in Bloom et al., 1956; Uno, 1998; and Granello, 2000) are provided next in their ascending order in the hierarchy.1

  1. Knowledge: Recalling or recognizing previously learned ideas or phenomena (including definitions, principles, criteria, conventions, trends, generalizations, sequences, classifications and categories, and structures) in the approximate form in which they were learned. Questions asked to prompt or assess a student's thinking behavior at this lowest level in the hierarchy require only factual recall (“regurgitation”), are easy to formulate, and typically incorporate verbs or phrases such as Define, Describe, State, Name, How much is, How did, or What is.

  2. Comprehension: Understanding the literal meaning of a communication, usually demonstrated by the ability to paraphrase or summarize, to predict consequences or effects, or to translate from one form to another. Questions linked to this level of Bloom's Taxonomy require students to show more in-depth understanding and typically use the verbs or phrases Explain, Summarize, Translate, Extrapolate, What is the main idea of, or Give an example of.

  3. Application: Selecting and using information (such as rules, methods such as experimental approaches, and theories) in a new and concrete context (including solving problems and performing tasks). At this level, questions ask students to use what they know without telling them how to use it, and, in addition to Apply, use verbs such as Use, Demonstrate, Compute, Solve, or Predict.

  4. Analysis: Breaking a concept, statement, or question into its components (e.g., assumptions, hypotheses, and evidence) and explaining the relationships between the components and the organizational structures and principles involved. Analysis includes the ability to distinguish relevant information from irrelevant information and facts from inferences, and to recognize fallacies in reasoning. Questions that assess students at this level ask them to Compare, Contrast, Categorize, Discriminate, Question, or Relate. Such questions could ask either for discrimination of the key elements in a written communication and their interrelationships or for reconstruction of the process by which something was done. Analysis of experimental data requires functioning at this level.

  5. Synthesis: Integrating and combining ideas to form a new product, pattern, plan, communication, or structure (including those for abstract relationships, such as classification schemes); solving problems involving creativity or originality. Questions that ask students to function at this cognitive level typically use the verbs Design, Develop, or Propose.

  6. Evaluation: Using a specific set of internal or external criteria or standards to arrive at a reasoned judgment (decision, appraisal, or critique) about the value of material for a given purpose. Questions used to assess an individual's level of competency in this category are typically open ended, with more than one correct answer or more than one path to an answer. They use verbs such as Judge, Appraise, Rate, Defend, Revise, or Assess. Critical appraisal of research papers, particularly when the findings are controversial or inconsistent with previous findings, falls under this category.

1If you want to assess your understanding of Bloom's Taxonomy after reading these initial descriptions, the first paragraph of this article may be used as part of a practice quiz. Referring to each question about questioning in the first paragraph of this article, can you identify the level of Bloom's Taxonomy at which the answerer would need to be competent to answer the question? For answers to this practice quiz, see Appendix A.

For a more in-depth assessment of your understanding of Bloom's Taxonomy, you may want to take the Bloom's Quiz in Appendix B.

APPENDIX A Understanding Bloom's Taxonomy: Practice Quiz

The levels of Bloom's Taxonomy that we assigned to each question in the opening paragraph of this article are given in the following table.

Question | Bloom's level
As a teacher, do you ask questions of your students? | Knowledge
When do you ask questions? | Knowledge
Are they oral questions or written questions? | Knowledge
For what purposes do you ask questions? | Comprehension
Do you write out in advance the questions you ask? | Knowledge
What kinds of questions do you tend to ask? | Analysis
What kinds of answers do you tend to get? | Analysis
What do you predict would happen in your classroom if you changed the kinds of questions that you ask? | Application
How could you collect data on and analyze your questioning patterns and the impact of different kinds of questions on your students' learning? | Synthesis
What criteria could you use to assess the effectiveness of your questions? | Evaluation

For further clarification of these categories, Table 1 provides not only a synopsis of words and phrases that often begin questions within each category, but also concrete example questions in each category that can be used to prompt thinking behaviors in students at each level of the hierarchy. Three topical areas in the life sciences—neurobiology, virology, and biological taxonomy—are used to demonstrate not only the distinctions in Bloom's categories, but also the hierarchical nature of the classification scheme.

Table 1. Examples of life science questions that can be used to prompt thinking behaviors at each level of the hierarchy in Bloom's Taxonomy of the Cognitive Domain (1956)a

Knowledge questions: Students remember and recall factual information.
Define, List, State, Label, Name, Describe
• Draw a typical neuron and label at least six parts on your drawing.
• What makes up the coat of a virus?
• Name the six kingdoms of living things.

Comprehension questions: Students demonstrate understanding of ideas.
Restate, Paraphrase, Explain, Summarize, Interpret, Describe, Illustrate
• What were the most important points raised in today's discussion of the differences between the functions of neurons and those of glia?
• Explain how the life cycle of a lytic virus operates.
• Describe how living things are classified into kingdoms.

Application questions: Students apply information to unfamiliar situations.
Apply, Demonstrate, Use, Compute, Solve, Predict
• On the basis of what you know about axon outgrowth, how would you explain the difficulties of treating spinal cord injuries?
• Given what you know about the life cycle of a virus, what effects would you predict anti-viral drugs to have on viruses?
• If a new life form were discovered, what process would you use to assign it to a kingdom?

Analysis questions: Students break ideas down into parts.
Compare, Contrast, Categorize, Distinguish
• Compare and contrast the pupillary light reflex and the patellar (knee) reflex.
• What distinguishes the replication processes of RNA and DNA viruses?
• How are fungi and plants similar to and different from each other?

Synthesis questions: Students transform ideas into something new.
Develop, Create, Propose, Formulate, Design, Invent
• How might stem cell research result in therapies for diseases such as Parkinson's disease?
• Propose a way in which viruses could be used to treat a human disease.
• Develop a classification system for objects commonly found in your kitchen. State the rules of your classification system.

Evaluation questions: Students think critically and defend a position.
Judge, Appraise, Recommend, Justify, Defend, Criticize, Evaluate
• Defend or criticize the statement “There is a gene for every behavior.”
• Would you argue that viruses are alive? Why or why not?
• Should the classification of living things be based on their genetic similarities or their morphology / physiology? What are the reasons for your choice?

aThe first column lists words that often begin questions at that level. The second column gives three example questions, one for each topical area in the life sciences (neurobiology, virology, and biological taxonomy). These questions demonstrate not only the distinctions among Bloom's categories, but also the hierarchical nature of the classification scheme. We assume for these questions that, at the application level and above, the context is new to the individuals answering the question.

Although Bloom's Taxonomy is a widely accepted classification system, it has its full share of critics. Some critics have questioned its validity because of its behaviorally specified goals—that is, because it requires individuals to demonstrate mental processes in observable ways, including task performance (Pring, 1971). Many critics have suggested that although research supports the basic hierarchical structure of the classification system, the hierarchy breaks down at the synthesis and evaluation levels, and that these are instead two divergent processes that operate at the same level of complexity (Seddon, 1978). Other critics have pointed out that Bloom's Taxonomy fails to acknowledge past history or context. For example, if a sophisticated appraisal of a research paper emerges from a student discussion, an exam question that then asks students to evaluate these same research findings will require them to function only at the lower knowledge or comprehension levels, simply recalling and restating the outcomes of an evaluative discussion. Finally, as Nordvall and Braxton (1996) have pointed out, the knowledge and comprehension levels of Bloom's Taxonomy do not acknowledge that some types of information are more difficult to remember and understand than others. For example, most students find it easier to briefly describe the three major functional types of RNA than to explain the details of how RNA is transcribed or translated. However, most educators agree that although the research on the validity of Bloom's Taxonomy is not necessarily conclusive, this taxonomy is a useful tool for making a distinction between lower-level and higher-order knowing and thinking (commonly referred to as critical thinking) and for improving our teaching.

Bloom's Taxonomy has provided a particularly useful way to investigate the congruence between course and curricular objectives and the content that is actually taught and assessed. Bloom and colleagues pointed out the utility of their model in this regard when they introduced it in the 1950s. Along with the classification system, they presented a content analysis of the types of questions that college faculty were typically asking on their course exams. They found that 70-95% of the questions that students encountered on these undergraduate exams required them to think only at the lower levels of knowledge and comprehension. Many researchers subsequently found that even 40 yr after the original publication of Bloom's Taxonomy, the typical college-level objective test question continued to assess predominantly the lower-order thinking levels (Gage and Berliner, 1992; Evans, 1999). With the advent of the National Science Education Standards and Project 2061 (American Association for the Advancement of Science, 1993; National Research Council, 1996) and the host of reform proposals in science education (e.g., National Science Foundation, 1996), we are all striving to develop critical thinking and scientific inquiry skills in students of all ages. To do so, we should ensure that our pedagogy in general and our questioning strategies in particular extend to the analysis, synthesis, and evaluation levels of Bloom's Taxonomy. Laboratory experiences clearly have the potential to foster intellectual development (problem solving, analysis, and evaluation); however, a content analysis of 10 manuals commonly used in undergraduate chemistry laboratory courses revealed that 8 of the 10 manuals focused on questions that challenged learners to think predominantly at the three lower levels of Bloom's Taxonomy (Domin, 1999). Clearly, we have a long way to go to achieve our goal.

The point of raising these findings is not to chastise the authors of these exams and manuals. Questions at the lower levels have appropriate and legitimate uses (remember that Bloom and colleagues considered knowledge and comprehension to be foundational to more complex cognitive processes). At the very least, such questions can verify student preparation and comprehension before teachers move on to materials and strategies that promote development of the higher-order thinking skills. Rather, the point is that the assessments and questions that we use in our teaching drive not only what we teach and how we teach it, but also what students learn (this concept is informally described as “what you measure is what you get,” or WYMIWYG; Hummel and Huitt, 1994). If our course assessments require predominantly lower-level thinking, such thinking is likely to be all that we will get from our students. In other words, asking a predominance of lower-level questions on exams or as part of classroom question-answer dialogues may fixate student thinking at this level and waste opportunities for us to develop students' more complex intellectual capabilities (Napell, 1976). Conversely, if we make more forays into developing effective and appropriate questions and assessments aimed at the higher-order thinking levels in Bloom's Taxonomy, there is at least a chance that we will also be teaching more at these levels and that students will have the opportunity to develop thinking behaviors at these levels. Using Bloom's Taxonomy (or some other validated taxonomy) to perform a careful content analysis of our instructional objectives—and of questions embedded in activities, assessments, and other student experiences—can therefore help to make us conscious of the potential misalignment between what we think our objectives are and the messages we send to students through our questions. Bloom's Taxonomy, not unlike assays routinely used in the laboratory to assess the quality and quantity of proteins, cells, or nucleic acids, can serve as a tool to measure the quantity and quality of the questions we ask in our teaching.

That said, in thinking about your own teaching, we hope you will consider again, deeply, the questions that we began with: As a teacher, do you ask questions of your students? When do you ask questions? For what purposes do you ask questions? What kinds of questions do you tend to ask? What kinds of answers do you tend to get? What do you predict would happen in your classroom if you changed the kinds of questions that you ask? And perhaps most important, how could you begin to collect data on and analyze your questioning patterns? We encourage you to share your experiences with and insights on answering these questions about questions.

APPENDIX B Understanding Bloom's Taxonomy: Quiz

As you develop familiarity with the categories in Bloom's Taxonomy, it can be useful to analyze questions, decide where you might place them in the categories, and explain why. To that end, we have provided this Bloom's Quiz, a collection of questions to use in probing your understanding of and insights into Bloom's Taxonomy. As described in this article, all questions used in teaching occur in a context, including the pedagogical structure in which they are presented and their relationship to the discussion of other concepts and topics. These quiz questions, however, are relatively free of contextual information. We challenge you to think about which category or categories they most often fit into and why you place them there. We have provided answers that represent the category in which we think each question would most often fit, and in some cases we have described gray areas where the question may fit well into more than one category. We hope that in analyzing the questions you will clarify your thinking about the taxonomy and perhaps find more gray areas yourself. In any case, enjoy thinking about the questions, and consider doing a similar analysis of the questions that you ask in your classrooms and laboratories.

BLOOM'S QUIZ

Questions

Suggested answers follow the questions.

  1. Design an experiment to test the hypothesis that some prostate cancer cells thrive after elimination of the influence of androgens because estrogen activates genes normally controlled by an androgen receptor.

  2. What factors might influence the contribution that industrial carbon dioxide emissions make to global temperature levels?

  3. How are proteins destined for export from a cell typically modified prior to secretion?

  4. Which of the following is not an event that occurs during the first division of meiosis: replication of DNA, pairing of homologous chromosomes, formation of haploid chromosome complements, crossing over, or separation of sister chromatids?

  5. Do the authors' data support their hypotheses and conclusions? Why or why not?

  6. Should embryos “left over” from in vitro fertilization procedures be used as sources of stem cells for biomedical research?

  7. Construct a concept map with the following title: Regulation of the Cell Cycle.

  8. How does the generalized life cycle of an animal differ from that of a plant?

Suggested Answers

  1. Synthesis

  2. Analysis: However, if these factors were previously discussed in class or presented in a reading assigned to students, this question involves only comprehension.

  3. Comprehension

  4. Knowledge

  5. This question intentionally brings out gray areas in trying to fit short questions to Bloom's categories without awareness of the context. According to the explanations provided in the text, the question could be at the analysis level; it requires the answerer to break down a communication about experimental findings into its components and explain their interrelationships. However, the question takes on a different character if, for example, it is asked in the context of peer review of a manuscript or of a student lab report. In this context, the methodology of the experiment may be open to question, or the authors may have taken an overly optimistic or confident viewpoint in interpreting their data. The answer would then require some critical appraisal (evaluation) and a knowledge of the standards used in communicating about experimental findings in a particular discipline.

  6. Evaluation: The answerer could find many written opinions on this issue through a quick search on the Internet. If other opinions were discussed or read previously and the answerer merely recapitulates another person's opinion, this question involves only comprehension.

  7. Synthesis, if the person constructing the map has not seen one before on this topic. A concept map is a collection of boxes, lines, and words used to represent understanding of major themes and ideas on a subject and how these ideas are interrelated. Maps are typically put together by placing key concepts related to the subject in the boxes, then arranging the boxes in a scheme that indicates hierarchies of importance or specificity (for example, with the “bigger ideas” at the top and a progression toward increasingly more specific concepts toward the bottom of the map). Lines drawn between boxes (propositional linkages) are used to indicate relatedness of concepts. A word or phrase above the linkage (usually a verb or an adverb) is used to indicate the nature of the relationship.

  8. Comprehension: Some people might argue that the level for this question is analysis if the answerer has not previously been told what the differences are (or read the typical introductory biology textbook treatment of animal versus plant cell cycles). Our opinion is that the cycles do not have to be broken into their components for the major differences to be evident.

  • American Association for the Advancement of Science (AAAS). (1993). Benchmarks for Science Literacy, Washington, DC: AAAS. Available online at http://www.project2061.org/tools/benchol/bolframe.htm (Project 2061 web site).
  • Bloom, B.S., Englehart, M.D., Furst, E.J., Hill, W.H., and Krathwohl, D.R. (1956). A Taxonomy of Educational Objectives: Handbook 1: Cognitive Domain, New York: McKay.
  • Domin, D.S. (1999). A content analysis of general chemistry laboratory manuals for evidence of higher order cognitive tasks. J. Chem. Ed. 76, 109-111.
  • Evans, C. (1999). Improving test practices to require and evaluate higher levels of thinking. Education 119, 616-618.
  • Gage, N.L., and Berliner, D.C. (1992). Educational Psychology, Boston: Houghton Mifflin.
  • Granello, D.H. (2000). Encouraging the cognitive development of supervisees: using Bloom's Taxonomy in supervision. Counselor Ed. Supervision 40, 31-46.
  • Hummel, J., and Huitt, W. (1994, Feb.). What you measure is what you get. GaASCD Newsletter: The Reporter, 10-11. Available online at http://chiron.valdosta.edu/whuitt/files/wymiwyg.html (last accessed June 24, 2002).
  • Kottke, J.L., and Schuster, D.H. (1990). Developing tests for measuring Bloom's learning outcomes. Psychol. Rep. 66, 27-32.
  • Kunen, S., Cohen, R., and Solman, R. (1981). A levels-of-processing analysis of Bloom's Taxonomy. J. Ed. Psychol. 73, 202-211.
  • Napell, S.M. (1976, Winter). Six common non-facilitating teaching behaviors. Contemp. Ed. 47(2), 79-82.
  • National Research Council. (1996). National Science Education Standards, Washington, DC: National Academy Press.
  • National Science Foundation (NSF). (1996). Shaping the Future: New Expectations for Undergraduate Education in Science, Mathematics, Engineering, and Technology (NSF 96-139), Arlington, VA: NSF.
  • Nordvall, R.C., and Braxton, J.R. (1996). An alternative definition of quality of undergraduate education: towards usable knowledge for improvement. J. Higher Ed. 67, 483-497.
  • Pring, R. (1971). Bloom's Taxonomy: a philosophical critique. Camb. J. Ed. 1, 83-91.
  • Seddon, G.M. (1978). The properties of Bloom's Taxonomy of educational objectives for the cognitive domain. Rev. Ed. Res. 48, 303-323.
  • Uno, G.E. (1998). Handbook on Teaching Undergraduate Science Courses: A Survival Training Manual, Philadelphia: Saunders.
LINKS TO WEB SITES ON BLOOM'S TAXONOMY

  • Division of Instructional Development, Office of Instructional Resources, University of Illinois at Urbana-Champaign. Levels and Types of Questions. http://www.oir.uiuc.edu/did/booklets/question/quest1.html
  • Krumme, G. University of Washington. Major Categories in the Taxonomy of Educational Objectives. http://faculty.washington.edu/krumme/guides/bloom.html
  • Learning Skills Program, Counselling Services, University of Victoria. Bloom's Taxonomy. http://www.coun.uvic.ca/learn/program/hndouts/bloom.html