
Supporting Student Competencies in Graph Reading, Interpretation, Construction, and Evaluation

    Published Online: https://doi.org/10.1187/cbe.22-10-0207

    Abstract

    Graphs are ubiquitous tools in science that allow one to explore data patterns, design studies, communicate findings, and make claims. This essay is a companion to the online, evidence-based interactive guide intended to help inform instructors’ decision-making in how to teach graph reading, interpretation, construction, and evaluation within the discipline of biology. We provide a framework focused on six instructional practices that instructors can utilize when designing graphing activities: use data that engage students, teach graphing grounded in the discipline, practice explicit instruction, use real-world “messy” data, utilize collaborative work, and emphasize reflection. Each component of this guide is supported by summaries of and links to articles that can inform graphing practices. The guide also contains an instructor checklist that summarizes key points with actionable steps to guide instructors as they work toward refining and incorporating graphing into their classroom practice, as well as emerging questions for which further empirical studies are warranted.

    INTRODUCTION

    In the sciences, as well as everyday life, graphs are common tools used to visually represent quantitative data to model, explain, predict, and communicate complex information and events. Given the ubiquitous utility of graphs across fields of study (e.g., economics, marketing, meteorology) as well as increasing access to data to inform personal and social decision-making, graphing is widely recognized as a core competency for future science practitioners and an educated populace (Padilla et al., 1986; George et al., 1996; NRC, 2003; AAMC-HHMI Committee, 2009; AAAS, 2011; NRC, 2015; NSF, 2016; Clemmons et al., 2020). In addition, graphing is employed throughout experimentation and inquiry in biology, from the early design stages through data exploration to the formal communication of findings to others (Gardner et al., 2022; Pelaez et al., 2022). Because graphs are visual models of qualitative and quantitative data that arise from observation and experimentation, the practice of graphing draws on the concepts and skills that comprise many other practices, including modeling, inquiry and experimentation, visualization, and quantitative reasoning. Therefore, in this guide we focus on these practices as they relate to graphing in biology. Because graphing is a learned ability, it is important to consider how instructional enhancements can facilitate the development of graphing competence based on our current understanding of the underlying practices pertaining to the reading, interpreting, constructing, and evaluating of graphs.

    Graphing is a complex practice embedded in the discipline

    Graphing competence collectively stems from the interrelated practices of reading, interpreting, constructing, and evaluating graph data. The ability to successfully make sense of and use graphs as quantitative models to explain or predict phenomena requires diverse cognitive processes that engage one’s perceptions of the data representation and relevant prior knowledge. For instance, graph construction requires the designer to identify the visualization’s purpose, perceive a display based on the data and graph design principles, transform the data and integrate disparate knowledge into the representation, and (often) employ technology to visualize and communicate the plotted relationships. In addition, knowledge relevant to graphing must come together with metacognitive processes, including reflection and critique (diSessa et al., 1991; diSessa, 2004), to evaluate and unpack graph data.

    Graphing is not unique to biology; however, like a physical tool, one’s graphing “tools” are learned within a particular context according to the theory of situated cognition (Brown et al., 1989). One’s abilities to graphically display and draw relevant conclusions from observed data are influenced by the discipline and by the system under study (e.g., sources of variation and variability and variable relationships of interest [Konold et al., 2014]; knowledge and expectations about the system, measurements, and data [Shah and Hoeffner, 2002]). Differences across disciplines, and even subdisciplines, will be reflected in graph design and interpretation. To this end, the extent of knowledge one has of the subject being studied affects the assumptions that are made about how broadly a graph will be understood (Xiong et al., 2019) and expert proficiency with graph interpretation (Roth and Bowen, 2001). Indeed, even experts within a subdiscipline of biology vary in their competence with interpreting graphs (e.g., ecology; Roth and Bowen, 2001). Finally, graphing is a social practice bound by the norms established by the communities of biology subdisciplines, as seen in how data are presented in scientific journals (Shah and Hoeffner, 2002; Roth, 2012; Weissgerber et al., 2015). Therefore, the cognitive processes associated with graphing practices can and should be taught and learned: individuals with higher levels of expertise perceive graphs differently than more novice viewers because of variation in the ability to activate prior knowledge (graphing and domain-specific) relevant to the presented data and to leverage that knowledge to evaluate graphs (reflecting and critiquing).

    Barriers to development of graphing competence

    In the United States, students are exposed to the fundamentals of graphing in science throughout their K–12 education (NRC, 2015). Previous research shows that students at the secondary level are often successful at plotting and reading discrete data points (Padilla et al., 1986), but challenges are observed at the college level in students’ ability to understand and think critically with data while graphing (e.g., Bray Speth et al., 2010; McFarland, 2010; Angra and Gardner, 2017; Harsh et al., 2019). Commonly identified areas of growth for college students include the ability to apply knowledge at the interface with statistics, to ground decisions within the experimental and biological contexts from which the data arose, and to engage in reflective reasoning. Research has suggested that such challenges in making sense of and using graphs may result from earlier instruction (i.e., the graphing expertise of and teaching approaches used by K–12 instructors [Bowen and Roth, 2003]) as well as from the nature and appropriateness of the graph data that students encounter in and outside of the classroom (Rybarczyk, 2011; Angra and Gardner, 2016, 2018). Therefore, in spite of years of practice graphing in primary and secondary school science classes, students often enter the college classroom needing to continue developing their graphing competencies to successfully engage with more complex biology concepts and data representations and to advance their quantitative skills as applied to biological practice.

    To support instructors in postsecondary settings (i.e., graduate students, faculty, and staff) who teach graphing skills, formal professional development resources for lesson planning, material selection, instruction, and assessment need to be widely available (Corwin et al., 2019) to improve graphing pedagogical content knowledge (Shulman, 1987). This graphing pedagogical content knowledge includes knowledge specific to the subject, graphing conventions, the science discipline, graph teaching, students’ prior experiences, and the classroom context. As experts in their discipline, instructors may not realize that their effortless and automatic thought processes with graphing are invisible to students. Therefore, instructors need to prioritize time in courses to unpack their expertise and share it with students, especially highlighting disciplinary considerations and epistemologies (Bowen et al., 1999). Finally, deciding which pedagogical approaches would be most effective in which contexts and for which student populations can be a challenge for many instructors (Shulman, 1987; Rose and Meyer, 2002).

    In addition to the role that instructors’ teaching practices play in the development of undergraduate biology students’ competence with graphing, students encounter graphs in a variety of contexts in and outside of the classroom. For instance, science textbooks often present quantitative information to students using select graph types that depict idealized data sets and/or lack common contextual graph design elements such as axis labels, units, and descriptive figure captions. While such basic displays are intended to highlight variable relationships, they often fail to represent the true “messy” nature of scientific data, leading to incomplete student understanding of data variability and of the biological system of study (Rybarczyk, 2011; Angra and Gardner, 2018). This simplistic design approach can also fail to model appropriate data practices in the field (Angra and Gardner, 2018), as graphs in scientific journals regularly include multivariate plots with messy data reflecting system variability. However, even practicing scientists need to improve their graphing practices, including the full depiction of variability and variation (Weissgerber et al., 2019), further suggesting the need for training through undergraduate and graduate school.

    How to address challenges and scope of the guide

    This essay is a companion to the online, evidence-based interactive guide intended to help inform instructor decision-making in how to teach behaviors and skills pertaining to the reading, interpreting, constructing, and evaluating of graphs. While the general practices of graphing transcend disciplinary boundaries, this guide and our recommendations for instructional practice also attend to discipline-specific considerations and practices related to graphing in biology (Figure 1). The guide presents recommendations for measuring graphing competence based on learning objectives, and it provides a framework to assist faculty when designing and implementing graphing activities. Lastly, published examples of classroom-based graphing activities that incorporate some, or all, of the approaches in the framework are provided. Each of these central points is supported by summaries of and links to articles that can inform graphing practices. The guide also comprises an instructor checklist that summarizes key points with actionable steps that can guide instructors as they work towards refining and incorporating graphing into their classroom practice. We hope this guide is a valuable resource that supports instructors’ efforts when considering how to approach graphing in their instruction.


    FIGURE 1. Graphing in biology evidence-based teaching guide landing page, which provides readers with an overview of choice points.

    GENERAL INSTRUCTIONAL PRINCIPLES

    Our guide organizes literature and national documents that provide targeted objectives on building graphing competence following the principles of backward design (Wiggins and McTighe, 1998). We hope this format allows instructors to clearly identify targets of student learning and to define strategies for capturing evidence of learning and of the development of competence (NRC, 2001; Allen and Tanner, 2007; Martone and Sireci, 2009). Currently, there exist high-level guidelines, research literature, published instructional resources, and assessment instruments that focus on or include student competence with graphing, especially graph interpretation. Consensus documents like Vision and Change (AAAS, 2011), the BioSkills guide (Clemmons et al., 2020), and the ACE-Bio Competencies for Experimentation in the Life Sciences (Pelaez et al., 2022) provide broad targets from which instructors can build targeted objectives relevant to their course and student contexts.

    There are different types of assessments that instructors can use depending on their purpose in revealing student competencies with graphing. Well-made assessments should challenge and engage students to use their prior knowledge and provide thorough, justified responses; they should also model real-world scenarios and achieve validity and reliability (Wiggins, 2019). Closed-response graphing tasks, such as multiple-choice questions on exams or in concept inventories, are useful for measuring granular, targeted practices (e.g., identifying a data trend) and are easy to grade; however, they lend limited insight into students’ higher-order thinking with graphs. Examples of existing validated instruments that include graphing measures are the Biological Science Quantitative Reasoning Exam (BioSQuaRE; Stanhope et al., 2017), the Statistical Reasoning in Biology Concept Inventory (SRBCI; Deane et al., 2016), and the Test of Scientific Literacy Skills (TOSLS; Gormally et al., 2012). These instruments can be used in the classroom as diagnostic tools to pinpoint areas of improvement, inform instructional choices, and support programmatic assessment. Open-ended tasks can reveal greater depth of insight into aspects of graphing competence, and do so in a way that is embedded in disciplinary practice, but they can be laborious to grade. However, this depth of insight into students’ decision-making and reasoning is necessary for instructors to fully understand how to support their students. One way to evaluate open-ended tasks is to use rubrics, which are effective and equitable assessment tools that can be utilized to communicate expectations, guide targeted instruction, and offer students the opportunity for self-assessment (e.g., Allen and Tanner, 2006). Natural language processing holds additional potential for gathering and analyzing even large-scale sets of open-ended responses to describe student thinking and skill (e.g., Jescovitch et al., 2021).

    In addition to more traditional targets and means of assessment, instructors can help students develop reflective skills and gain important insight into student reasoning by prompting students to articulate their reasoning about a graph they constructed (Schmitt-Harsh and Harsh, 2013; Angra and Gardner, 2016, 2018) or a claim they made (Harsh et al., 2019), in a way that would not be possible by simply looking at the final graphs they created. Regardless of the assessment type, graphing is a complex skill that requires frequent rounds of student practice and instructor feedback (Roth and McGinn, 1997; Roth and Bowen, 2001; McFarland, 2010; Harsh and Schmitt-Harsh, 2016). Therefore, we contend that a single instance of graph construction or interpretation is insufficient evidence for instructors to confirm proficiency.

    Inclusive Teaching

    Students enter the learning space with a rich diversity of everyday experiences, which contribute to their different funds of knowledge (e.g., the knowledge gained from everyday experiences and from family [NRC, 2003]). To teach in an inclusive and equitable manner, instructors need to be aware of students’ graphing experiences and interests; they can use early, frequent formative assessment practices to identify areas of improvement, guide the resources made available to all students, and target any differentiated instruction to enhance learning.

    Accessibility is an additional consideration when teaching and assessing graphing skills. Because graphs are visual objects, instructor decisions regarding graph design and presentation can affect their accessibility to blind, visually impaired, and colorblind students. The use of freely available resources to guide design choices (e.g., color selection, enlarged print) and to provide accessible data displays (e.g., tactile graphics, alt text along with graphics) can remove barriers for these students (Braille Authority of North America, 2011; Levine, 2019; Stone et al., 2019; CAST, 2022). Simple instructional strategies can also improve how all students engage with visual data, with effects that are disproportionately greater for visually impaired students. These include explicit verbal descriptions, peer instruction, using alternative models (e.g., physical modeling), and varying assessments of competencies. Finally, instructors can explore ways in which their graphing instruction aligns with principles of Universal Design for Learning (Jones et al., 2011; CAST, 2022), such as recruiting student interest by allowing student-driven inquiry, clarifying language and symbols associated with graphs, or providing multiple ways to create and interact with graphs (e.g., pen-and-paper, digital). To learn more about inclusive teaching, see the Inclusive Teaching Evidence-Based Teaching Guide (Dewsbury and Brame, 2019).
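
    To make these design considerations concrete, below is a minimal sketch, not taken from the guide, of what a few accessible defaults might look like in Python with matplotlib; the palette is the Okabe–Ito colorblind-safe set, and the data, treatment names, and file name are invented for illustration.

```python
# A minimal sketch of accessible graph design choices: a colorblind-safe palette,
# redundant encoding (markers and line styles in addition to color), enlarged text,
# and a reminder to pair the exported image with descriptive alt text.
import matplotlib.pyplot as plt

# Okabe-Ito colors remain distinguishable under common color vision deficiencies.
OKABE_ITO = ["#0072B2", "#D55E00", "#009E73"]

days = [0, 2, 4, 6, 8]
growth = {  # hypothetical seedling heights (cm) under three light treatments
    "Full sun": [1.0, 2.4, 4.1, 6.0, 7.8],
    "Partial shade": [1.0, 1.9, 3.0, 4.2, 5.1],
    "Full shade": [1.0, 1.4, 1.9, 2.3, 2.6],
}

plt.rcParams.update({"font.size": 14})  # enlarged print for readability
fig, ax = plt.subplots()
for (label, y), color, marker, ls in zip(
        growth.items(), OKABE_ITO, ["o", "s", "^"], ["-", "--", ":"]):
    ax.plot(days, y, color=color, marker=marker, linestyle=ls, label=label)

ax.set_xlabel("Time (days)")            # labeled axes with units
ax.set_ylabel("Seedling height (cm)")
ax.set_title("Seedling growth under three light treatments")
ax.legend(title="Treatment")
fig.savefig("seedling_growth.png", dpi=300)
# When sharing the figure digitally, add alt text, e.g., "Line graph of seedling
# height over 8 days; full-sun seedlings grow fastest, full-shade slowest."
```

    Because grouping is encoded redundantly by color, marker shape, and line style, the display does not depend on color perception alone.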

    DESIGNING GRAPHING ACTIVITIES

    Evidence-based design features that improve the development of graphing competence include the following principles: 1) use data that engage students, 2) teach graphing grounded in the discipline, 3) practice explicit instruction, 4) use real-world messy data, 5) use collaborative work, and 6) emphasize reflection. While most of these features are not unique to graphing, this guide organizes what each would look like as applied in practice within the context of graphing instruction, based on the literature. Integrating even one of these principles will improve classroom conversations around data. Further, instructors may notice overlap across principles (e.g., teaching in the discipline may overlap with using real-world, messy data) and choose to combine multiple principles in the classroom, which further benefits student learning of graphing. This is described in detail in the Designs in Action section of the guide.

    Use data that engage students

    A sense of autonomy and of connectedness to others while working on a task are two factors that support a student’s desire to engage and learn, according to basic research on motivation (Renninger and Hidi, 2022) and theories of motivation such as self-determination theory (Deci and Ryan, 2012). In the context of graphing instruction, this includes allowing students to pursue their interests while collecting, analyzing, and interpreting data. Inquiry activities in labs and lectures can allow students to design investigations to gather and analyze their own data, find and analyze existing data gathered by others, or explore data visualizations created by themselves or others.

    The ever-increasing array of research-quality data being generated and made publicly available is an incredible potential resource for instructors to use in engaging students with data from investigations that would be logistically or financially out of reach in a given classroom. This was especially useful for instruction during the COVID-19 pandemic, when it was challenging for students to collect data themselves, and it remains a valuable inclusive practice today. These data from primary sources can increasingly be found as supplemental files for science publications and in data repositories.

    Exploring and analyzing data collected by someone else (second-hand data) versus data collected by the students themselves (first-hand data) has been shown to have some overlapping and some distinct learning outcomes (Hug and McNeill, 2008; DeBoy, 2017; Kjelvik and Schultheis, 2019). Therefore, instructors need to keep this in mind when deciding where the data that students work with come from. For example, well-scaffolded learning opportunities with second-hand data should prompt students to understand the nature of the variables and their natural variation, how and why the data were collected and by whom, the affordances and limitations of the instrumentation and measurement systems, and the biological significance of the data set or graph they are reading. This knowledge comes more naturally with first-hand data collected by students.
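
    As a minimal sketch of such scaffolding, assuming a course that uses Python and pandas, the prompts below could precede any graphing of second-hand data; the data set shown is invented and stands in for a repository download.

```python
# Before graphing second-hand data, students inspect what was measured, in what
# units, by whom and how often, and how much it varies. The table is built inline
# so the sketch runs; in practice it might come from pd.read_csv() on a download.
import pandas as pd

df = pd.DataFrame({
    "site": ["upstream"] * 3 + ["downstream"] * 3,
    "date": pd.to_datetime(["2023-09-01", "2023-09-08", "2023-09-15"] * 2),
    "dissolved_oxygen_mg_L": [8.9, 9.2, 8.7, 6.1, 5.8, 6.5],
})

print(df.dtypes)                                 # What variables and units were recorded?
print(df["site"].value_counts())                 # Where and how often were samples taken?
print(df.groupby("site")["dissolved_oxygen_mg_L"].describe())  # How much natural variation?
```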

    Teaching graphing grounded in the discipline

    As mentioned previously, beyond the biological variables and systems under study and how they are studied, graphing is a social practice bound by the norms established by the biology community and rooted in the biological systems from which data arise (Roth, 2012). This includes typical and expected ways in which data are presented and graphs are constructed that are perpetuated within subfields of biology, as can be seen in how data are presented in scientific journals, for better or worse (Shah and Hoeffner, 2002; Weissgerber et al., 2015). However, people overestimate the extent to which graphs will be read similarly by different people with varying degrees of knowledge related to the graph (Xiong et al., 2019). In addition, even experts within the same discipline (e.g., ecology) have sub-expertise related to the concepts and processes that they study in their research, which can limit or promote their understanding of graphs (Bowen et al., 1999).

    This guide summarizes work that highlights the importance of situating graph teaching and learning in the discipline of practice, to the benefit of students’ graphing competence and conceptual knowledge, as well as its potential impact on their motivation.

    Explicit Instruction

    Graphing is a complex practice that requires the application and integration of concepts and skills from multiple disciplines (e.g., biology, statistics, cognitive science, and visual perception). Therefore, supporting students in learning graphing benefits from purposeful and explicit instruction. As with many expert practices, the tacit knowledge needs to be made visible, and the practice needs to be broken down into its constituent parts.

    One approach to supporting students is to provide them with stepwise approaches to graphing. This not only breaks down a complex task but can also provide students with a framework to remind them of the steps to take along the way. Examples of published graphing instructional tools that can help instructors and students with these steps are available (see the step-by-step guide and the guide to tables and figures in Angra and Gardner, 2016; Harsh and Schmitt-Harsh, 2016; Harsh et al., 2019). For graph construction by hand, these steps include planning, drawing, and reflecting (Angra and Gardner, 2016, 2017), which is similar to recommendations for graphing using data analysis software (Tairab and Al-Naqbi, 2004, p. 130; Patterson and Leonard, 2005). For reading and interpreting graphs, evidence (Shah and Hoeffner, 2002; Harsh et al., 2019) supports instructing students to first read and decode the graph by noting and understanding the graph framework (e.g., axes and scale) and contextual features (e.g., variable labels, title, axis labels). By first orienting themselves to the graph, students can move on to interpreting it. This includes reading the data by noting and describing the patterns displayed in the graph and reading between the data by comparing variables and trends (Curcio, 1987). It also includes reading beyond the data to place the displayed data and trends back into the context from which they came (e.g., experiments) to make inferences and predictions (Curcio, 1987).
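
    As one illustration of making these steps explicit, the sketch below annotates a simple construction task with the plan, draw, and reflect sequence and with reading prompts in the spirit of Curcio (1987); it assumes Python with numpy and matplotlib, and the data are invented.

```python
# Plan-draw-reflect for a simple bivariate graph, with the expert decision points
# written out as comments. All values are hypothetical.
import numpy as np
import matplotlib.pyplot as plt

# PLAN: identify the variables, their types, and the relationship of interest.
# A continuous predictor (temperature) vs. a continuous response (heart rate)
# suggests a scatter plot, optionally with a fitted trend.
temp_c = np.array([10, 14, 18, 22, 26, 30])       # water temperature (°C)
heart_rate = np.array([12, 18, 22, 31, 35, 42])   # Daphnia heart rate (beats per 10 s)

# DRAW: build the graph framework (axes, scale) and contextual features
# (variable labels with units, informative title, legend).
fig, ax = plt.subplots()
ax.scatter(temp_c, heart_rate, label="Individual observations")
slope, intercept = np.polyfit(temp_c, heart_rate, 1)
ax.plot(temp_c, slope * temp_c + intercept, linestyle="--", label="Linear fit")
ax.set_xlabel("Water temperature (°C)")
ax.set_ylabel("Heart rate (beats per 10 s)")
ax.set_title("Daphnia heart rate increases with water temperature")
ax.legend()
fig.savefig("daphnia_heart_rate.png", dpi=300)

# REFLECT: prompts students can answer after drawing --
#   Read the data: what is the approximate heart rate at 22 °C?
#   Read between the data: how much does heart rate change per degree of warming?
#   Read beyond the data: what would you predict at 35 °C, and is extrapolating justified?
```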

    Another approach to supporting student graphing is to teach graphing as an ongoing journey of learning and mastering a practice. The Cognitive Apprenticeship Model (CAM; Dennen, 2004) in education supports student learning by not only breaking down tasks to make them explicit and visible but also incorporating components drawn from social constructivist theory, including metacognitive practices during authentic learning experiences (Collins et al., 1987; Dennen, 2004). Components of the CAM such as articulation, reflection, and exploration have been shown to be common in many science practices (e.g., reasoning about conclusions; Hogan and Maglienti, 2001) and specifically in graphing practices (diSessa, 2004; Maltese et al., 2015; Angra and Gardner, 2017; Harsh et al., 2019).

    Use real-world messy data

    In contrast to the often simplified and stylized representations of data within textbooks (Hoskins et al., 2007; Rybarczyk, 2011; Angra and Gardner, 2018), biological data and the systems from which they are measured are inherently variable and “messy.” This messiness arises from natural variation, measurement error, outliers, and decisions made about data acquisition and analysis. Obscuring this natural variation and variability can affect not only students’ conceptual understanding of biology and the nature of inquiry and measurement but also the conclusions they draw from data (Schultheis and Kjelvik, 2020). For example, if students only encounter graphs with smooth relationships between the plotted variables, they may mistakenly expect that such relationships are clear and consistent, leading to stronger-than-warranted claims about the data they see depicted (Schultheis and Kjelvik, 2020). Students might also be confused when the data they plot from their own inquiry do not match their expectations for how science is done, and they could mistakenly attribute the source of variation to measurement error, not appreciating the natural variation inherent to biological systems.

    Students’ acquisition of disciplinary concepts and inquiry processes, development of mathematical and statistical literacy, ability to draw appropriate conclusions from data, and comprehension of their surroundings are all enhanced when they are exposed to real-world messy data while creating and viewing graphs (Kastens et al., 2015; Kjelvik and Schultheis, 2019; Schultheis and Kjelvik, 2020). Having students work with real-world messy data during data analysis, including constructing and reading graphs, supports students’ understanding of the true nature of biology and provides them with the opportunity to further develop their quantitative reasoning and critical thinking (Kastens et al., 2015; Kjelvik and Schultheis, 2019; Schultheis and Kjelvik, 2020). Instructors can also use this as an opportunity to connect raw data to the biological concepts covered in the classroom and to emphasize the importance of consistent data collection during investigations (e.g., Pelaez et al., 2022). While data collection experiences are easily accomplished in laboratory courses, it is also possible for instructors in lecture courses to utilize case studies (e.g., Data Nuggets, HHMI Biointeractive) or publicly available data sets (e.g., data.gov; usafacts.org) in large lecture classrooms. The guide summarizes work that highlights the importance of utilizing messy data as well as resources and public databases for instructors to utilize in their classrooms.
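
    As a minimal sketch of revealing, rather than concealing, variability when graphing (in the spirit of Weissgerber et al., 2015, 2019), the example below plots every simulated observation alongside group means and standard errors; it assumes Python with numpy and matplotlib, and all values are simulated for illustration.

```python
# A two-group comparison plotted so the "messiness" stays visible: jittered
# individual observations plus group means with standard errors, rather than
# summary bars alone. All values are simulated.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(7)
groups = ["Control", "Fertilized"]
samples = [rng.normal(loc=mu, scale=3.0, size=20) for mu in (18, 24)]  # biomass (g)

fig, ax = plt.subplots()
for i, values in enumerate(samples):
    x = np.full(values.size, float(i)) + rng.uniform(-0.08, 0.08, values.size)  # jitter
    ax.scatter(x, values, alpha=0.6)
    mean = values.mean()
    sem = values.std(ddof=1) / np.sqrt(values.size)
    ax.errorbar(i, mean, yerr=sem, fmt="_", color="black", capsize=6, markersize=25)

ax.set_xticks(range(len(groups)))
ax.set_xticklabels(groups)
ax.set_xlabel("Treatment")
ax.set_ylabel("Plant biomass (g)")
ax.set_title("Individual plants shown with group means ± SEM")
fig.savefig("biomass_variability.png", dpi=300)
```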

    Utilize collaborative work

    Graphs invite and engage others in conversation about their contents and meaning (Roth and McGinn, 1997). Students can participate in the cooperative practices of the scientific community by explaining their thinking and negotiating diverse points of view while making data decisions through shared exercises (Roth and McGinn, 1997). This allows for the incorporation of potentially differing viewpoints and perspectives regarding a graph’s purpose, contents, structure, and meaning. Shofner and Marbach-Ad (2017) provide examples of two inquiry-based graphing activities that were implemented in a large-enrollment introductory biology classroom. Students were asked to work in teams to write hypotheses, incorporate biological ideas (e.g., equilibrium) into their graphs, interpret graphs, and participate in the peer review process. Although 80% of the students reported positive feedback on working in the group setting, 15% reported negative feedback, which stemmed from a preference for listening to information or working alone and from lacking confidence in the biology content knowledge needed to provide feedback to their peers (Shofner and Marbach-Ad, 2017). To ease students into working in teams, it is recommended that each group member be given a distinct task (e.g., recorder, facilitator, presenter; Shofner and Marbach-Ad, 2017). To learn more about implementing group work in undergraduate biology courses, see the Group Work Evidence-Based Teaching Guide (Wilson et al., 2017).

    Emphasize reflection

    The last element in the design framework is to offer students the chance to evaluate and reflect on graphs that they have made or that they have read and interpreted from external resources (e.g., primary literature, media, textbooks, peers). The abilities to think critically and to self-reflect are essential to student learning (Tynjälä, 1999) and are often indicative of expert behaviors (diSessa, 2004; Angra and Gardner, 2017). Competency with graphing requires one to go beyond representational competence (generating and making meaning from graphs) and to engage in designing new graphs, explaining graphs, understanding the role a graph plays, and critiquing and evaluating the affordances and limitations of a given graph over other possible alternatives; this is known as metarepresentational competence (MRC; diSessa et al., 1991; diSessa, 2004). To increase students’ confidence, refine their critical thinking skills, and strengthen the reflection component of MRC, written reflections should be modeled by the instructor and performed and practiced by students several times throughout a course. McFarland (2010) shares a 90-min active-learning activity that requires college-level biology students enrolled in a laboratory (or lecture) course to collaboratively practice cognitive and metacognitive skills to improve their graph literacy. A specific series of questions around the purpose of graphs, criteria for appropriate graph construction, and alternative graph types is provided in McFarland (2010). To learn more about promoting student metacognition in undergraduate biology courses, see Tanner (2012) and the Fostering Metacognition to Support Student Learning and Performance Evidence-Based Teaching Guide (Stanton et al., 2021).

    DESIGNS IN ACTION

    This section of the guide presents published examples that apply one or more of the evidence-based principles for graph instruction in either the biology lecture or an experiential setting (e.g., a laboratory course). To develop competence with graphing, students must practice it often and in as many contexts as possible. Small modifications to existing curricula in the lecture and laboratory settings can lead to large benefits for students. Instructors can utilize free, field-tested materials from online websites (HHMI, QUBES Hub, Data Nuggets, Course Source, etc.) that provide learning objectives, short graphing exercises, and assessments that can be easily incorporated into the curriculum. While instructors may not be able to devote an entire semester to a large-scale intervention, they can emphasize one or more elements from the designing graphing activities framework (Gardner et al., 2022). Instructors can easily emphasize graphs in the classroom by taking the time to explain their expert thinking as they construct, read, or interpret a graph. This small change can be effective at helping students understand the graphing parameters so that they can practice in the same manner and work toward expert-level mastery.

    Having students collaborate to collect and curate authentic, messy, meaningful data promotes student confidence and motivation and encourages them to engage in higher-order thinking by connecting experimental design with graphing (Gardner et al., 2022). Three studies that demonstrate the design, implementation, and assessment procedures that include all elements of the designing graphing activities framework are highlighted. The first study is an inquiry-based ecology unit for introductory, nonscience students that emphasizes graphing to study a real-world problem, the water quality of a campus stream (Harsh and Schmitt-Harsh, 2016). The second study is a semester-long intervention in an animal behavior lecture classroom, which showed how incorporating graphing materials such as Data Nuggets (Schultheis and Kjelvik, 2015), the Graph Rubric (Angra and Gardner, 2018), and the Step-by-Step Guide (Angra and Gardner, 2016) improved students’ ability to interpret the purpose of a graph, understand the nature of the data, explain relationships between independent and dependent variables, and compose a take-home message (Weigel and Angra, 2023). The last example is a 4-year intervention study in an upper-level physiology laboratory course, in which the authors found that using evidence-based graphing materials coupled with explicit instruction was beneficial to student learning and graphical skill development (Gardner et al., 2022). Complete summaries of these examples and more studies can be found in the Designs in Action portion of the guide.

    EMERGING QUESTIONS

    While there is an existing and growing evidence base to guide instructional choices, additional research within clinical (e.g., interviews) and naturalistic learning settings will further strengthen our knowledge of student competence development and the ways in which instructional approaches support it. Below we offer questions that emerged as we put together this evidence-based teaching guide.

    A few questions remain unexplored that could further our understanding of how graph construction and interpretation activities are designed and taught:

    • How do students construct and interpret graphs in groups?

    • How is the relationship between graph construction and interpretation taught in the classroom?

    • How do instructors of varying graphical expertise approach teaching these skills?

    • Over the course of a semester or the undergraduate course of study, how many opportunities do students need with graph construction and interpretation to master these skills?

    Much of what we know about students’ competence with graphing comes from assessments outside of the practices of graphing within the context of inquiry (e.g., McKenzie and Padilla, 1986; Gormally et al., 2012; Stanhope et al., 2017). Therefore, it will be important to answer additional questions:

    • What are the graphing practices that students use when engaged in the naturalistic settings of biological inquiry?

    • What is the nature of tasks that motivates or necessitates that students engage in graphing?

    • What do students view as the purpose of graphing in their courses and learning of biology?

    • How much practice with graphing is needed until it becomes a “habit of mind” and a part of students’ approach to thinking with data?

    To help instructors support their students’ competence with graphing, additional questions need to be answered to inform instructional design decisions:

    • Which instructional design features, or combination of features, are most important when teaching graphing? In what ways do they support individual competence development?

    • How can instructors encourage frequent and effective reflections on graphs inside and outside the classroom?

    • How can graphical knowledge be scaffolded and advanced across undergraduate and graduate studies?

    CONCLUSION

    Teaching graph construction and interpretation is most effective when the learning objectives and assessments are clearly articulated and are relevant to and appropriate for the student and course context. Our guide provides a framework of best practices for teaching graph construction and interpretation, and we provide resources that allow for the integration of graphing skills into the classroom. We recommend the incorporation and application of several evidence-based design principles that engage students and improve graphing competence. The development of competence with graphing is important for students not only in their undergraduate biology courses; it also equips them with the tools and knowledge needed to make decisions in their daily lives.

    ACKNOWLEDGMENTS

    This work benefited from the research on graph construction in undergraduate biology conducted under National Science Foundation #1726180 & 2111150 (S.M.G.) and the collaborations and opportunities within the ACE-Bio Network (National Science Foundation #1346567). Any opinions, findings, and conclusions, or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation. Many thanks to the members of the Purdue International Biology Education Research Group (PIBERG) and the James Madison University Biology Education Research Group (jmuBERG) for their valuable feedback and assistance over the years. We thank Drs. Mary Hegarty and David Uttal for their feedback and paper recommendations at the outset of the creation of this guide.

    REFERENCES

  • AAMC-HHMI Committee (2009). Scientific foundations for future physicians (pp. 26–29). Washington, DC: Association of American Medical Colleges. Google Scholar
  • Allen, D., & Tanner, K. (2006). Rubrics: Tools for making learning goals and evaluation criteria explicit for both teachers and learners. CBE—Life Sciences Education, 5(3), 197–203. LinkGoogle Scholar
  • Allen, D., & Tanner, K. (2007). Putting the horse back in front of the cart: Using visions and decisions about high-quality learning experiences to drive course design. CBE—Life Sciences Education, 6(2), 85–89. LinkGoogle Scholar
  • American Association for the Advancement of Science (2011). Vision and change in undergraduate biology education: A call to action, final report from July 15–17, 2009. Washington, DC: AAAS. Google Scholar
  • Angra, A., & Gardner, S. M. (2016). Development of a framework for graph choice and construction. Advances in Physiology Education, 40, 123–128. MedlineGoogle Scholar
  • Angra, A., & Gardner, S. M. (2017). Reflecting on graphs: Attributes of graph choice and construction practices in biology. CBE—Life Sciences Education, 16(3), ar53. LinkGoogle Scholar
  • Angra, A., & Gardner, S. M. (2018). The graph rubric: Development of a teaching, learning, and research tool. CBE—Life Sciences Education, 17(4), ar65. LinkGoogle Scholar
  • Bowen, G. M., Roth, W. M., & McGinn, M. K. (1999). Interpretations of graphs by university biology students and practicing scientists: Toward a social practice view of scientific representation practices. Journal of Research in Science Teaching, 36(9), 1020–1043. Google Scholar
  • Bowen, G. M., & Roth, W. M. (2003). Graph interpretation practices of science and education majors. Canadian Journal of Science, Mathematics and Technology Education, 3, 499–512. Google Scholar
  • Braille Authority of North America (2011). Guidelines and standards for tactile graphics. https://www.brailleauthority.org/tg/web-manual/index.html Google Scholar
  • Bray Speth, E., Momsen, J. L., Moyerbrailean, G. A., Ebert-May, D., Long, T. M., Wyse, S. A., & Linton, D. (2010). 1, 2, 3, 4: Infusing quantitative literacy into introductory biology. CBE—Life Sciences Education, 9, 323–332. MedlineGoogle Scholar
  • Brown, J. S., Collins, A., & Duguid, P. (1989). Situated cognition and the culture of learning. Educational Researcher, 18(1), 32–42. Google Scholar
  • CAST (2022). Universal Design for Learning Guidelines version 2.2. http://udlguidelines.cast.org Google Scholar
  • Clemmons, A. W., Timbrook, J., Herron, J. C., & Crowe, A. J. (2020). BioSkills guide: Development and national validation of a tool for interpreting the Vision and Change core competencies. CBE—Life Sciences Education, 19(4), ar53. LinkGoogle Scholar
  • Collins, A., Brown, J. S., & Newman, S. E. (1987). Cognitive apprenticeship: Teaching the craft of reading, writing and mathematics (Technical Report No. 403) (pp. 1–37). Cambridge, MA: BBN Laboratories. Google Scholar
  • Corwin, L. A., Kiser, S., LoRe, S. M., Miller, J. M., & Aikens, M. L. (2019). Community College Instructors’ Perceptions of Constraints and Affordances Related to Teaching Quantitative Biology Skills and Concepts. CBE—Life Sciences Education, 18(4), ar64. LinkGoogle Scholar
  • Curcio, F. R. (1987). Comprehension of mathematical relationships expressed in graphs. Journal for Research in Mathematics Education, 18, 382–393. Google Scholar
  • Deane, T., Nomme, K., Jeffery, E., Pollock, C., & Birol, G. (2016). Development of the statistical reasoning in biology concept inventory (SRBCI). CBE—Life Sciences Education, 15(1), ar5. LinkGoogle Scholar
  • DeBoy, C. A. (2017). Student Use of self-data for out-of-class graphing activities increases student engagement and learning outcomes. Journal of Microbiology & Biology Education, 18(3), 1–23. https://doi.org/10.1128/jmbe.v18i3.1327 Google Scholar
  • Deci, E. L., & Ryan, R. M. (2012). Self-determination theory. In Van Lange, P. A. M.Kruglanski, A. W.Higgins, E. T. (Eds.), Handbook of theories of social psychology (Vol. 1, pp. 416–437). Thousand Oaks, CA: Sage Publications. https://doi.org/10.4135/9781446249215.n21 Google Scholar
  • Dennen, V. P. (2004). Cognitive apprenticeship in educational practice: Research on scaffolding, modeling, mentoring, and coaching as instructional strategies. Handbook of Research on Educational Communications and Technology, 2(2004), 813–828. Google Scholar
  • Dewsbury, B., & Brame, C. J. (2019). Inclusive teaching. CBE—Life Sciences Education, 18(2), fe2. LinkGoogle Scholar
  • diSessa, A. A., Hammer, D., Sherin, B., & Kolpakowski, T. (1991). Inventing graphing: Meta-representational expertise in children. Journal of Mathematical Behavior, 10, 117–160. Google Scholar
  • diSessa, A. A. (2004). Metarepresentation: Native competence and targets for instruction. Cognition and Instruction, 22, 293–331. Google Scholar
  • Gardner, S. M., Angra, A., & Harsh, J. A. (2022). A Framework for Teaching and Learning Graphing in Undergraduate Biology. In Pelaez, N.Anderson, T.Gardner, S. M. (Eds.), Trends in Teaching Experimentation in the Life Sciences (pp. 143–170). Switzerland: Springer Nature. Google Scholar
  • George, M., Bragg, S., de los Santos Jr., A. G., Denton, D. D., Gerber, P., Lindquist, M. M., ... & Meyer, C. (1996). Shaping the future: New expectations for undergraduate education in science, mathematics, engineering, and technology (pp. 1–414). Arlington, VA: National Science Foundation. Google Scholar
  • Gormally, C., Brickman, P., & Lutz, M. (2012). Developing a test of scientific literacy skills (TOSLS): Measuring undergraduates’ evaluation of scientific information and arguments. CBE—Life Sciences Education, 11(4), 364–377. LinkGoogle Scholar
  • Halmo, S. M., Sensibaugh, C. A., Reinhart, P., Stogniy, O., Fiorella, L., & Lemmons, P. P. (2020). Advancing the Guidance Debate: Lessons from Educational Psychology and Implications for Biochemistry Learning. CBE—Life Sciences Education, 19(3) ar41. https://doi.org/10.1187/cbe.19-11-0260 LinkGoogle Scholar
  • Harsh, J. A., & Schmitt-Harsh, M. L. (2016). Instructional strategies to develop graphing skills in the college science classroom. The American Biology Teacher, 78(1), 49–56. Google Scholar
  • Harsh, J. A., Campillo, M., Murray, C., Myers, C., Nguyen, J., & Maltese, A. V. (2019). “Seeing” data like an expert: An eye-tracking study using graphical data representations. CBE—Life Sciences Education, 18(3), ar32. MedlineGoogle Scholar
  • Hogan, K., & Maglienti, M. (2001). Comparing the epistemological underpinnings of students' and scientists' reasoning about conclusions. Journal of Research in Science Teaching: The Official Journal of the National Association for Research in Science Teaching, 38(6), 663–687. Google Scholar
  • Hoskins, S. G., Stevens, L. M., & Nehm, R. H. (2007). Selective use of the primary literature transforms the classroom into a virtual laboratory. Genetics, 176(3), 1381–1389. MedlineGoogle Scholar
  • Hug, B., & McNeill, K. L. (2008). Use of first-hand and second-hand data in science: Does data type influence classroom conversations? International Journal of Science Education, 30(13), 1725–1751. Google Scholar
  • Jescovitch, L. N., Scott, E. E., Cerchiara, J. A., Merrill, J., Urban-Lurain, M., Doherty, J. H., & Haudek, K. C. (2021). Comparison of machine learning performance using analytic and holistic coding approaches across constructed response assessments aligned to a science learning progression. Journal of Science Education and Technology, 30(2), 150–167. Google Scholar
  • Jones, J. L., Jones, K. A., & Vermette, P. J. (2011). Planning learning experiences in the inclusive classroom: Implementing the three core UDL principles to motivate, challenge and engage all learners. Electronic Journal for Inclusive Education, 2(7), 6. Google Scholar
  • Kastens, K., Krumhansl, R., & Baker, I. (2015). THINKING BIG: Transitioning your students from working with small, student-collected data sets toward “big data.” The Science Teacher, 82(5), 25–31. Google Scholar
  • Kjelvik, M. K., & Schultheis, E. H. (2019). Getting messy with authentic data: Exploring the potential of using data from scientific research to support student data literacy. CBE—Life Sciences Education, 18(2), es2. LinkGoogle Scholar
  • Konold, C., Higgins, T., Russell, S. J., & Khalil, K. (2014). Data seen through different lenses. Educational Studies in Mathematics, 88(3), 305–325. Google Scholar
  • Levine, A. (2019, March 1) True colors: Optimizing charts for readers with color vision deficiencies. https://itstraining.wichita.edu/optimize_for_vision_deficiencies/ Google Scholar
  • Maltese, A. V., Harsh, J. A., & Svetina, D. (2015). Data visualization literacy: Investigating data interpretation along the novice—expert continuum. Journal of College Science Teaching, 45(1), 84–90. Google Scholar
  • Martone, A., & Sireci, S. G. (2009). Evaluating alignment between curriculum, assessment, and instruction. Review of Educational Research, 79(4), 1332–1361. Google Scholar
  • McFarland, J. (2010). Teaching and assessing graphing using active learning. MathAMATYC Educator, 1(2), 32–39. Google Scholar
  • McKenzie, D. L., & Padilla, M. J. (1986). The construction and validation of the test of graphing in science (TOGS). Journal of Research in Science Teaching, 23(7), 571–579. Google Scholar
  • National Research Council. (2001). Knowing what students know: The science and design of educational assessment. Washington, DC: National Academy Press. Google Scholar
  • National Research Council (2003). BIO2010: Transforming Undergraduate Education for Future Research Biologists. Washington, DC: National Academies Press. Google Scholar
  • National Research Council. (2015). Guide to implementing the next generation science standards. Washington, DC: The National Academies Press. Google Scholar
  • National Science Foundation (2016). 10 Big Ideas. Alexandria, VA: National Science Foundation. https://www.nsf.gov/news/special_reports/big_ideas/ Google Scholar
  • Patterson, T. F., & Leonard, J. G. (2005). Turning spreadsheets into graphs: An information technology lesson in whole brain thinking. Journal of Computing in Higher Education, 17(1), 95–115. Google Scholar
  • Padilla, M. J., McKenzie, D. L., & Shaw, E. L. (1986). An examination of the line graphing ability of students in grades seven through twelve. School Science and Mathematics, 86(1), 20–26. Google Scholar
  • Pelaez, N., Gardner, S. M., & Anderson, T. (2022). The problem with teaching experimentation: Development and use of a framework to define fundamental competencies for biological experimentation. In Pelaez, N.Anderson, T.Gardner, S. M. (Eds.), Trends in Teaching Experimentation in the Life Sciences (pp. 3–27). Switzerland: Springer Nature. Google Scholar
  • Renninger, K., & Hidi, S. E. (2022). Interest development, self-related information processing, and practice. Theory Into Practice, 61(1), 23–34. Google Scholar
  • Rose, D. H., & Meyer, A. (2002). Teaching every student in the digital age: Universal design for learning (pp. 1–228). Alexandria, VA: Association for Supervision and Curriculum Development. Google Scholar
  • Roth, W. M., & McGinn, M. K. (1997). Graphing: Cognitive ability or practice? Science Education, 81(1), 91–106. Google Scholar
  • Roth, W. M., & Bowen, G. M. (2001). Professionals read graphs: A semiotic analysis. Journal for Research in Mathematics Education, 32(2), 159–194. Google Scholar
  • Roth, W. M. (2012). Undoing decontextualization or how scientists come to understand their own data/graphs. Science Education, 97(1), 80–112. Google Scholar
  • Rybarczyk, B. (2011). Visual literacy in biology: A comparison of visual representations in textbooks and journal articles. Journal of College Science Teaching, 41(1), 106. Google Scholar
  • Schmitt-Harsh, M., & Harsh, J. A. (2013). The development and implementation of an inquiry-based poster project on sustainability in a large non-majors environmental science course. Journal of Environmental Studies and Sciences, 3(1), 56–64. Google Scholar
  • Schultheis, E. H., & Kjelvik, M. K. (2015). Data nuggets: Bringing real data into the classroom to unearth students’ quantitative & inquiry skills. The American Biology Teacher, 77(1), 19–29. Google Scholar
  • Schultheis, E. H., & Kjelvik, M. K. (2020). Using messy, authentic data to promote data literacy & reveal the nature of science. The American Biology Teacher, 82(7), 439–446. Google Scholar
  • Shah, P., & Hoeffner, J. (2002). Review of graph comprehension research: Implications for instruction. Educational Psychology Review, 14(1), 47–69. Google Scholar
  • Shofner, M. A., & Marbach-Ad, G. (2017). Group activity to enhance student collaboration, graph interpretation, and peer evaluation of ecological concepts in a large-enrollment class. Journal of Microbiology & Biology Education, 18(3), 18–13. Google Scholar
  • Shulman, L. (1987). Knowledge and teaching: Foundations of the new reform. Harvard Educational Review, 57(1), 1–23. Google Scholar
  • Stanhope, L., Ziegler, L., Haque, T., Le, L., Vinces, M., Davis, G. K., … & Overvoorde, P. J. (2017). Development of a biological science quantitative reasoning exam (BioSQuaRE). CBE—Life Sciences Education, 16(4), ar66. LinkGoogle Scholar
  • Stanton, J. D., Sebesta, A. J., & Dunlosky, J. (2021). Fostering metacognition to support student learning and performance. CBE—Life Sciences Education, 20(2), fe3. LinkGoogle Scholar
  • Stone, B. W., Kay, D., & Reynolds, A. (2019). Teaching visually impaired college students in introductory statistics. Journal of Statistics Education, 27(3), 225–237. Google Scholar
  • Tairab, H. H., & Al-Naqbi, A. K. (2004). How do secondary school science students interpret and construct scientific graphs? Journal of Biological Education, 38, 127–132. Google Scholar
  • Tanner, K. D. (2012). Promoting student metacognition. CBE—Life Sciences Education, 11(2), 113–120. LinkGoogle Scholar
  • Tynjälä, P. (1999). Towards expert knowledge? A comparison between a constructivist and a traditional learning environment in the university. International Journal of Educational Research, 31(5), 357–442. https://doi.org/10.1016/S0883-0355(99)00012-9 Google Scholar
  • Xiong, C., Van Weelden, L., & Franconeri, S. (2019). The curse of knowledge in visual data communication. IEEE Transactions on Visualization and Computer Graphics, 26(10), 3051–3062. MedlineGoogle Scholar
  • Weigel, E. G., & Angra, A. (2023). Teaching in Tandem. Journal of College Science Teaching, 52(4), 78–86. Google Scholar
  • Weissgerber, T. L., Milic, N. M., Winham, S. J., & Garovic, V. D. (2015). Beyond bar and line graphs: Time for a new data presentation paradigm. PLoS Biology, 13(4), e1002128. MedlineGoogle Scholar
  • Weissgerber, T. L., Winham, S. J., Heinzen, E. P., Milin-Lazovic, J. S., Garcia-Valencia, O., Bukumiric, Z., … & Milic, N. M. (2019). Reveal, don’t conceal: Transforming data visualization to improve transparency. Circulation, 140(18), 1506–1518. MedlineGoogle Scholar
  • Wiggins, G. (2019). The case for authentic assessment. Practical Assessment, Research, and Evaluation, Vol. 2, Article 2. https://scholarworks.umass.edu/pare/vol2/iss1/2 Google Scholar
  • Wiggins, G., & McTighe, J. (1998). Backward design. In: Understanding by design (pp. 13–34). Alexandria, VA: Association for Supervision and Curriculum Development (ASCD). Google Scholar
  • Wilson, K. J., Brickman, P., & Brame, C. J. (2017). Evidence based teaching guide: Group work. CBE—Life Sciences Education, 17(1), fe1. Google Scholar