
General Essays and Articles

Access to Online Formative Assessments in Lower-Division Undergraduate Biology Courses: Investigating Barriers to Student Engagement

    Published Online: https://doi.org/10.1187/cbe.22-05-0098

    Abstract

    Instructors use a variety of online formative assessment (FA) activities to support learning outside class. Previous studies have revealed barriers for students in online courses, but little is known about the barriers students experience when completing online FA assignments. Understanding these barriers to access is critical to fostering more inclusive learning for all students. Using a framework from previous work in online learning, we examined student perceptions of online FA access with respect to five barrier categories: technical resources, instructor organization, social interactions, personal engagement, and learning environment. We developed and administered a survey to more than 1200 undergraduate biology students at 2-year and 4-year institutions. Students responded to statements using Likert scales and open-ended prompts. Statistical models indicated differences in access across the barrier categories and revealed that demographic characteristics were associated with certain barrier categories. Furthermore, technical resources, instructor organization, and personal engagement barriers were associated with lower course performance. In open-ended responses, students most frequently suggested that changes to scheduling logistics, course delivery, and FA format would improve their online FA experience. We discuss how these findings and student suggestions can inform instruction, particularly how instructors can alter their FA characteristics to better suit their student populations.

    INTRODUCTION

    Formative assessments (FAs) are widely used instructional tools that gauge student learning and thereby provide valuable feedback to students and instructors (Black and Wiliam, 2009; Evans et al., 2014). FAs can take many forms, including quizzes, homework activities, and other question sets (Black and Wiliam, 2009). These assignments are designed with the goal of helping students advance their understanding of course concepts, rather than evaluating students for the purpose of assigning grades (Sadler, 1989). FAs can improve student learning and course performance (Sadler, 1989; Boston, 2002; Black and Wiliam, 2009; Freeman et al., 2011; Eddy and Hogan, 2014), and courses that include regular FA activities have particular benefits for students from underrepresented demographic groups (Freeman et al., 2011; Eddy and Hogan, 2014). Our previous work also has found that students value various types of FAs and recognize a variety of ways that these activities support their learning (Brazeal et al., 2016, 2021).

    The literature consistently highlights several defining features of FA activities (Natriello, 1987; Sadler, 1989; Boston, 2002; Black and Wiliam, 2009; Freeman et al., 2011; Offerdahl et al., 2018). Broadly, FAs are designed to facilitate an iterative process wherein information and feedback are exchanged between the learner and instructor. Based on prior literature, Black and Wiliam (2009) summarized five key goals of an effective FA: 1) clarify criteria for success, 2) elicit evidence of learning, 3) provide useful feedback to students, 4) encourage students to be resources for one another, and 5) prompt students to take control of their own learning. To achieve these goals, instructors first choose or develop an assessment task intended to elicit evidence of student learning that aligns with the associated learning goals (Natriello, 1987; Black and Wiliam, 2009). As students complete the activity, the instructor can use information from student responses to adjust teaching strategies and address student knowledge gaps. Students can also use information from the FA to self-correct and seek out additional learning resources (Black and Wiliam, 2009; Greenstein, 2010). Guided by their instructors, students learn to identify gaps in knowledge, take action to improve understanding, and reach learning goals (Boekaerts and Corno, 2005).

    FAs can take place both inside and outside the classroom setting. In the classroom, FA activities help create an active-learning environment, and instructors have some control over a student’s FA experience, such as by providing instructional cues, creating space for student interactions, and guiding student learning (Hill and Epps, 2010; Lei, 2010; Knight et al., 2013). In light of limited class meeting time, out-of-class assignments represent an essential way to extend the learning experience (Magalhães et al., 2020), and asynchronous assignments can additionally enable students to progress through an activity at their own pace and develop independent learning skills (Baleni, 2015). However, instructors may have more limited ability to monitor and influence the resources and support to which students have access, and the out-of-class learning environment may present students with particular challenges related to their personal circumstances (e.g., financial situation, responsibility for dependents, employment schedule). Given the critical role that FAs play in facilitating learning, we need to further understand how different factors potentially influence student engagement with these activities when completed outside class.

    Online FA Assignments

    Over the past decade, asynchronous online FA administration has become commonplace in undergraduate education (Freeman et al., 2011; Gikandi et al., 2011; Elmahdi et al., 2018). These online assignments can scaffold learning in ways that cannot be achieved with traditional paper activities (Rayner, 2008). For example, online platforms can progressively display more challenging questions as the student becomes familiar with the material and can provide immediate feedback to students, a key component of the FA process (Rovai, 2000; Gaytan and McEwen, 2007). In addition, online FA grading can often be done automatically, leaving more time for the instructor to identify where students are struggling and adjust content accordingly (Boitshwarelo et al., 2017; Alruwais, 2018).

    Unfortunately, online assignments can also present barriers for students who lack Internet access or have limited access to suitable electronic devices (Alruwais, 2018; Khan and Khan, 2019). Some students prefer paper assessments to online assessments, noting concerns with accessibility (Baleni, 2015; Khan and Khan, 2019; Magalhães et al., 2020). Students with a preference for paper assessments report having experienced Internet outages or software crashes while completing online assignments (Baleni, 2015; Khan and Khan, 2019). Others perceive that instructors have less familiarity with online platforms, resulting in confusion and poorly laid-out assignments (Khan and Khan, 2019).

    In addition to student preferences, online learning environments pose accessibility barriers that hinder progress for select groups of students, suggesting a potential link between FA accessibility and certain demographic characteristics (Jaggars, 2011). Studies have found that age, education, income, gender, and race are associated with student-reported access to online courses (Muilenburg and Berge, 2005; Jaggars, 2011; Palmer et al., 2013). Additionally, while first-generation student access to FAs has not been specifically explored, these students may experience more general obstacles to academic success (e.g., job responsibilities, family responsibilities, inadequate study skills, and mental health struggles) than non–first-generation students (Stebleton and Soria, 2013). The extracurricular nature of these challenges suggests that they may interface with online FA access. Less is known about the difference in access between community college students and university students, but external responsibilities, such as employment, may disproportionately affect community college student achievement (Bers and Smith, 1991). Thus, while online FAs have several advantages, more work is needed to understand factors that may limit student engagement.

    Theoretical and Conceptual Framework

    Black and Wiliam’s theory summarizes how FAs promote learning by articulating five goals of an effective FA (Black and Wiliam, 2009). This theory, however, provides a description of FAs in an environment with ideal conditions to support learning. Though instructors can regulate some aspects of the FA process (e.g., task design, scheduling logistics, assessment format), there are several factors that typically fall outside an instructor’s direct control (e.g., student access to necessary devices, student self-efficacy, external learning environment). Furthermore, certain demographic groups tend to experience greater educational barriers, suggesting that student engagement with online FAs may be shaped by broader societal contexts. As part of the broader imperative to make learning experiences more inclusive and equitable, we take steps here to identify barriers that affect student participation in online assignments. Our theoretical framework positions FA access within the broader context of personal demographics and course performance (Figure 1). Demographic characteristics can directly affect access, but they may also shape personal circumstances that in turn play a role in online FA access. Online FA access can then affect course performance by influencing a student’s content learning and assignment completion scores. We also note that student demographic characteristics relate to overall course performance in a variety of other complex ways, independent from online FA access. By using the information explored through this framework, we can work to improve access, particularly for students at a systemic disadvantage (Ainscow, 2016). We note that, for the purposes of our study, the term “access” is used to define a student’s resource availability and ability to engage with online FAs. We do not intend inferences to other types of access (e.g., disability access).

    FIGURE 1.

    FIGURE 1. Theoretical framework used to study the relationships between demographic characteristics, access to online FAs, and course performance. We anticipate that demographic characteristics affect FA access, but these demographics may also contribute to personal circumstances that in turn affect access. We also predict that online FA access will relate to course performance. While we recognize that demographics can connect to course performance in other ways, our study specifically focuses on how demographics shape FA access and how FA access relates to course performance.

    Researchers have categorized barriers to online learning in general, primarily in the context of an entire online course and outside science, technology, engineering, and mathematics (STEM) settings (Mungania, 2004; Assareh and Hosseini Bidokht, 2011; Abuhammad, 2020). Muilenburg and Berge (2005) in particular made a key contribution to the knowledge of student access by identifying question categories, survey items, and associated factors that reflect student online learning experiences. Although based on fully online course contexts, their work provides an important basis for understanding various dimensions of the online learning environment. Building on their findings, we explore student access to FA assignments with respect to five barrier categories: technical resources, instructor organization, social interactions, personal engagement, and learning environment. These categories provide a relevant conceptual framework for our investigation to identify barriers that occur when students specifically engage with online FA assignments in the context of in-person courses. Additionally, this framework provides a lens through which to understand how specific barriers relate to demographic factors and course performance.

    Technical Resources

    Significant barriers to online learning stem from the electronic devices and Internet connection required to use online educational materials. Limited access to reliable, fast Internet, suitable electronic devices, and associated software hinders students’ abilities to engage with online learning materials (Muilenburg and Berge, 2005; Abuhammad, 2020). Confidence in using devices and software can also affect students’ experiences with e-learning (Assareh and Hosseini Bidokht, 2011). The level of access to devices and confidence in technology usage varies among students of different genders, races, ethnicities, ages, and income levels (Muilenburg and Berge, 2005; Porter and Donthu, 2006; Palmer et al., 2013).

    Instructor Organization

    Instructors, though not physically present when students engage in out-of-class FAs, play a central role in designing learning materials and communicating expectations (Heuer and King, 2004). Instructor-related factors as a whole, including online course organization, assignment schedule, and communication with students, are associated with student access to online courses (Muilenburg and Berge, 2005). For example, students’ inability to locate the assignments or due dates poses barriers to their online learning experiences (Heuer and King, 2004). Less is known regarding instructor barriers and student demographics, but student cultural backgrounds potentially shape their perceptions of an instructor’s behavior (Levy et al., 1997).

    Social Interactions

    The FA process is supported by various sociocultural constructivist learning theories that suggest that social interactions are essential to building knowledge (Trumbull and Lash, 2013). Similarly, Black and Wiliam’s theory of how FAs promote learning includes the idea that social interactions represent a key component of the FA experience (Black and Wiliam, 2009). Social interactions such as discussion and collaboration among students provide many benefits that lead to improved content understanding and increased achievement (Soller, 2001; Jung et al., 2002; Laal and Ghodsi, 2012). Despite their benefits, peer interactions may not occur readily in the virtual setting, posing a significant barrier to student engagement with remote learning (Muilenburg and Berge, 2005; Becker et al., 2013). Out-of-class assignments, by nature, tend to be less collaborative than in-class activities, because students are not physically surrounded by classmates. As a result, students report participating less in discussions for out-of-class assignments (Brazeal et al., 2016). Demographics and external responsibilities may relate to students’ engagement in peer discussion. In particular, women are more likely to engage in peer learning than men, and students who do not have external commitments like employment are more likely to collaborate than those who work (Sobhanian and Ye, 2016).

    Personal Engagement

    Students’ personal circumstances and tendencies can affect how they engage with online assignments. In-class FAs occur in an environment where the instructor can designate time for completion and encourage participation. Out-of-class assignments, however, require students to be the primary drivers for completing their work (Bates and Khasawneh, 2007; Knowles and Kerkman, 2007), which interfaces with their time management skills and other external commitments. These external commitments (e.g., caring for dependents, employment) vary with demographics, such as age (Compton et al., 2006). Low self-efficacy in completing online course components is also a barrier to online learning (Muilenburg and Berge, 2005), and there are differences in academic self-efficacy based on gender, class rank, economic situation, and perceived academic achievement (Satici and Can, 2016).

    Learning Environment

    The external environment in which students engage with course materials can influence their learning experience. Little is known about this learning environment and its relationship to online course components, but research on in-person learning suggests that students’ learning environments affect their performance and satisfaction (Dorman, 2001; Hill and Epps, 2010). Additionally, the growth of the Internet and the enticement of online activities have introduced digital distraction and “cyber-slacking” to the list of factors affecting student engagement (Aagaard, 2015; Attia et al., 2017; Flanigan and Kiewra, 2018). Students may be tempted to browse social media, respond to a message, or multitask by watching videos online when completing an online FA. Engaging with materials in out-of-class settings, especially when an electronic device is required to complete an assignment, may make students particularly prone to distraction. When completing online assignments, students may also be forced to move out of their typical learning environments to seek an Internet connection, and academic self-efficacy can vary across different location types (e.g., school, home, café; Zhao et al., 2010).

    Study Rationale

    Though several studies have investigated barriers to online learning in the context of fully online courses, we have little information regarding the access and barriers that students experience when completing FA assignments, which represent a critical course structure supporting out-of-class learning. Similarly, previous research has identified demographic characteristics that may affect online learning, though not specifically in relation to online FAs. Thus, in the present study, we sought to characterize the extent to which students perceived that they had access or faced barriers (i.e., lacked access) concerning online FA assignments in the context of traditional in-person courses. We were guided by four primary research questions relating to student online FA access:

    1. How commonly do students report that they experience barriers to online FAs?

    2. Do student demographic factors relate to particular perceived barriers?

    3. Do perceived barriers relate to course performance?

    4. According to students, what can instructors do to lower perceived barriers?

    We developed a survey instrument to address these research questions in lower-division undergraduate biology courses. In addition to providing data for the research investigation, this instrument will also provide a way for instructors to gauge the presence of key barriers in their courses, identify students or groups of students in need of additional supports, and collect suggestions from students about changes that could improve their online FA experiences.

    METHODS

    Survey Development and Piloting

    We leveraged materials and findings from the Muilenburg and Berge (2005) study as a starting place for survey development. We structured our survey around five categories potentially affecting student access to online FAs: technical resources, instructor organization, social interactions, personal engagement, and learning environment. For each category, we adapted questions from Muilenburg and Berge (2005) and drafted additional questions to support the content validity of each category (i.e., the extent to which the questions cover the full range of a category). Among the authors, we iteratively revised the questions to ensure that the items were clearly worded, targeted online FA assignments, used updated technical terminology, and could apply across different courses.

    Each category initially contained eight to 10 positively worded items (e.g., “I have regular access to an appropriate device, such as a laptop or tablet, to complete [the FA]”) for a total of 43 draft survey items. The portion in brackets for each question was replaced with the name of the activity used in a given course in order to use labels that would be familiar to students (e.g., “homework quiz”). Students responded to these items on a seven-point Likert-type agreement scale and completed a demographics questionnaire at the end. The survey items were all positively worded, meaning that a high level of agreement indicates high access. When interpreting survey responses, we conceptualize “access” and “barriers” as being inversely related: students reporting high access means they have few barriers, and students reporting low access suggests they have high barriers for the given item or category.

    We targeted students in lower-division (100-level and 200-level) biology courses taught by eight instructors at one 2-year and one 4-year institution (Table 1). We surveyed students at different institution types to better understand the barriers faced in multiple settings, not necessarily as a basis for comparison. The pilot survey was administered online via Qualtrics (Qualtrics, 2020) during the last 3 weeks of the Fall 2020 semester. Students were offered a small amount of either regular course credit or extra credit for survey participation. In prepandemic semesters, these courses were all considered “traditional” courses with regular in-person class meetings. Due to the COVID-19 pandemic, these courses were administered through a variety of methods, including in-person, hybrid, and remote formats.

    TABLE 1. Survey administration institution information

    Institution type                          Instructors   Students   % of sample
    Fall 2020 (pilot)
     4-year institution, Midwest              5             697        93
     2-year institution, Midwest              3             52         7
    Spring 2021 (final)
     4-year institution, Midwest              5             1003       79
     2-year institution, Midwest              4             198        16
     2-year institution, Pacific Northwest    2             61         5

    Data Processing

    For students who consented to share their data for research purposes, we processed the data by first calculating survey completion times. If a participant sat idle on a page for more than 20 minutes, we replaced the idle time with the average time for that page. We then removed duplicate attempts, submissions that were less than half-completed, and surveys submitted in less than 3 minutes. The 3-minute cutoff was determined based on the distribution of submission times and an estimate of the minimum time needed to read and answer the questions. This initial pilot resulted in 749 usable responses (representing 65% of course enrollment).
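    To make these screening criteria concrete, the following R sketch illustrates one way to implement them. The input data frames (page_times, responses) and all column names are assumptions for illustration, not the authors’ actual processing code.

```r
# Illustrative R sketch of the screening criteria (data frame and column
# names are assumptions, not the authors' actual code)
library(dplyr)

# page_times: one row per submission x survey page, with time in seconds
completion <- page_times %>%
  group_by(page_id) %>%
  # Cap idle pages: times over 20 min are replaced with that page's mean time
  mutate(seconds = ifelse(seconds > 20 * 60,
                          mean(seconds[seconds <= 20 * 60]),
                          seconds)) %>%
  group_by(submission_id) %>%
  summarise(total_minutes = sum(seconds) / 60)

usable <- responses %>%
  left_join(completion, by = "submission_id") %>%
  arrange(start_time) %>%                      # order attempts chronologically
  distinct(student_id, .keep_all = TRUE) %>%   # keep first of duplicate attempts
  filter(prop_complete >= 0.5,                 # drop < half-completed surveys
         total_minutes >= 3)                   # drop rapid (< 3 min) submissions
```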

    Survey Revisions

    We used the pilot data to refine our survey for the subsequent administration. To investigate the degree to which our items aligned with the different survey categories, we ran a confirmatory factor analysis (CFA) on the pilot data using the lavaan package in R (Rosseel, 2012) and found that the resulting model had promising but below-adequate fit on some indicators (comparative fit index [CFI] = 0.846, Tucker-Lewis index [TLI] = 0.837, root mean square error of approximation [RMSEA] = 0.071, standardized root mean square residual [SRMR] = 0.072). Based on these results, we removed survey items with low factor loadings and limited relevance to their categories, leaving a survey composed of six items per category for a total of 30 survey items. A second CFA including only data from the remaining 30 items suggested improved fit (CFI = 0.906, TLI = 0.896, RMSEA = 0.067, SRMR = 0.060).
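    For readers replicating this step, a five-factor CFA of this form can be specified in lavaan roughly as follows. The item names are placeholders (shown with the six retained items per category), and survey_data is an assumed data frame of numeric Likert responses.

```r
# Minimal lavaan sketch of the five-factor CFA (item names are placeholders)
library(lavaan)

model <- '
  technical    =~ tech1 + tech2 + tech3 + tech4 + tech5 + tech6
  organization =~ org1  + org2  + org3  + org4  + org5  + org6
  social       =~ soc1  + soc2  + soc3  + soc4  + soc5  + soc6
  engagement   =~ eng1  + eng2  + eng3  + eng4  + eng5  + eng6
  environment  =~ env1  + env2  + env3  + env4  + env5  + env6
'

fit <- cfa(model, data = survey_data)

# The fit indices reported in the text
fitMeasures(fit, c("cfi", "tli", "rmsea", "srmr"))

# Standardized loadings; items with low loadings were candidates for removal
standardizedSolution(fit)
```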

    In addition to removing select Likert-scale items, we added an open-ended section to the final survey. Students in the final administration were shown a randomly selected survey item along with the Likert response they had given to that item. They then answered two open-ended prompts about that item. Prompt 1 assessed whether students had interpreted the item as intended. Prompt 2 asked what supports they would like their courses or institutions to implement to help alleviate potential barriers to online FAs; suggestions could align closely with the specific survey item or address another related area. These prompts were repeated for a second randomly selected survey item for each student. Because items were selected at random, students may have seen items with which they had either agreed or disagreed, allowing us to compare the responses of students for whom an item represented a barrier with those of students who did not experience that barrier.

    After refining the survey, we conducted eight student interviews to determine whether students were interpreting the survey items as intended. Student volunteers from an undergraduate genetics course (not included in the survey samples) met with the researcher via videoconferencing. During these 60-minute think-aloud interviews (Ericsson and Simon, 1980), participants explained their understanding of the survey items, talked through their reasons for selecting a response, and indicated unclear areas. Based on interview data from the eight participants, we adjusted item wording before the final survey administration.

    Final Survey Administration

    The final survey was administered during the last 3 weeks of the Spring 2021 semester (see full survey in Supplemental Table 1). We surveyed students in an expanded group of lower-division biology courses taught by 11 instructors at three institutions, including both institutions from the pilot study and an additional 2-year institution (Table 1). COVID-19 restrictions remained in place during this semester, so these courses were again administered through a variety of formats. Self-reported demographic information for the survey participants can be found in Table 2, separated by institution type in Supplemental Table 2. The survey format was the same as the pilot survey: students responded to Likert-type survey questions, the two new open-ended items, and a demographic questionnaire via Qualtrics. We processed the data according to the same criteria outlined for the pilot survey. Processing removed 203 responses due to 101 individuals who did not consent to share their data for research purposes, two rapid responses (<3 minutes), and 100 duplicate and/or incomplete responses. After data processing, we were left with 1262 usable responses (representing 79% of course enrollment; response rates were similar across institution types).

    TABLE 2. Final survey participant demographic information

                                  n^a     % of sample
    Gender^b
     Female                       906     72
     Male                         330     26
     Self-describe                5       0.4
    Race/ethnicity
     Non-URM                      1109    83
     URM^c                        177     13
     Self-describe                11      0.9
    Class rank
     First-year                   557     44
     Sophomore                    314     25
     Junior                       211     17
     Senior                       101     8
     Postbaccalaureate            34      3
     Graduate student             8       0.6
     Other                        18      1
    First-generation status
     Not first generation         724     57
     First generation             505     40
    Language spoken at home
     English                      1104    87
     Other                        142     11
    Career plan
     Life sciences                947     75
     Other                        295     23
    Institution type
     2-year                       259     21
     4-year                       1003    79

    aNumbers do not add to full sample size, because some students left the given item blank.

    bThose who self-described their gender all identified as nonbinary.

    cUnderrepresented racial/ethnic groups included participants who self-identified as African American/Black, Hispanic/Latinx, Native American/Alaska Native, or Native Hawaiian/Pacific Islander.

    Calculation of Access Scores

    Likert-scale responses were converted to numerical data for analysis (strongly disagree = 1, disagree = 2, somewhat disagree = 3, neither agree nor disagree = 4, somewhat agree = 5, agree = 6, strongly agree = 7). For each category, we calculated the mean Likert score across the items for a given student (referred to as an access score). Access and barriers are inversely related. For example, a calculated access score of 6.5 would indicate high access/few barriers; a score of 2.5 would indicate low access/high barriers. Students with an access score greater than 4.5 were considered to have high access/few barriers, whereas students with access scores less than or equal to 4.5 were considered to have low access/high barriers for the category. We similarly classified student access into two groups for each item: no barrier reported (i.e., students who agreed with the item to some degree) or barrier reported (i.e., students who disagreed or chose the neutral option). In both cases, we treated the neutral range or option as reflecting an underlying barrier, because these responses were commonly associated with some degree of impaired access in student open-ended responses, even if only slightly.
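    A minimal R sketch of this scoring procedure, assuming the raw responses are stored as text labels in a data frame (category_items is a hypothetical name for one category’s six items):

```r
# Minimal sketch of the access-score calculation (object names assumed)
likert_map <- c("strongly disagree" = 1, "disagree" = 2,
                "somewhat disagree" = 3, "neither agree nor disagree" = 4,
                "somewhat agree" = 5, "agree" = 6, "strongly agree" = 7)

# category_items: data frame of one category's six Likert responses (text labels)
numeric_items <- as.data.frame(lapply(category_items,
                                      function(x) unname(likert_map[x])))

# Category access score: mean across the six items for each student
access_score <- rowMeans(numeric_items, na.rm = TRUE)

# Category classification: score > 4.5 = high access/few barriers
high_access <- access_score > 4.5

# Item classification: neutral or disagreement (<= 4) = barrier reported
item_barrier <- numeric_items <= 4
```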

    Statistical Analyses

    Using the final data set, we began by conducting a CFA to determine whether the factor structure recapitulated what we found in our pilot work. We calculated Cronbach’s alpha for each category by performing scale reliability analysis in SPSS (IBM, 2020). We calculated Pearson correlations between all pairs of barrier categories to understand the degree of correspondence across categories. We ran a linear mixed-effects model in JMP Pro 15 (SAS Institute, 2020) to detect differences among the five barrier categories. We included the five barrier categories as separate independent variables along with student nested within instructor as random effects and access score as the dependent variable. Post hoc Tukey tests were then conducted between all pairs of barrier categories.
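    These steps were run in SPSS and JMP Pro; for readers working in R, they correspond roughly to the sketch below. All object and column names (category_items, access_scores_wide, long, instructor_id, student_id) are assumptions, and the long-format specification treats barrier category as a single five-level factor, which matches the post hoc comparisons between categories.

```r
# Approximate R equivalents of the reliability, correlation, and
# mixed-effects analyses (the authors used SPSS and JMP Pro)
library(psych)     # Cronbach's alpha
library(lme4)      # mixed-effects models
library(lmerTest)  # p-values for fixed effects
library(emmeans)   # post hoc Tukey comparisons

# Internal reliability for one category's six items
psych::alpha(category_items)

# Pearson correlations between the five category access scores
cor(access_scores_wide, use = "pairwise.complete.obs")

# long: one row per student x category, with access_score, category,
# student_id, and instructor_id
m <- lmer(access_score ~ category + (1 | instructor_id/student_id), data = long)
anova(m)

# Tukey-adjusted comparisons between all pairs of barrier categories
emmeans(m, pairwise ~ category, adjust = "tukey")
```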

    We next sought to understand the relationship between student demographics and reported barriers. As outlined in the Introduction, previous work on barriers to online learning identified a variety of connections to demographic characteristics, and our prior studies found that student buy-in to certain FA activities also tracked with underlying demographic variables (Choy, 2001; Alexander et al., 2009; Hurtado et al., 2009; Haak et al., 2011; Brazeal et al., 2016; Estrada et al., 2016; Matz et al., 2017). Given the broader patterns linking demographic attributes to course performance and program persistence (Choy, 2001; Alexander et al., 2009; Matz et al., 2017), we sought to identify demographic patterns that might shape online FA access. Because the barrier categories were only moderately correlated, we conducted separate mixed-effects models for each barrier category. Additionally, we ran these same mixed-effects models separately for each institution type to understand how demographic factors may relate to access in different settings.

    Finally, we wanted to determine whether there was an association between online FA access and course performance (i.e., final course percent grade) as a means to gauge potential connections to a relevant academic outcome. Course performance data were provided by instructors. We estimated five separate mixed-effects models, each containing a different barrier category as an independent variable, controlling for demographics and including instructor as a random effect in order to study relationships with the dependent variable of overall course performance. Additionally, we estimated a model to determine whether the number of barrier categories with sufficient access per student related to course performance, again controlling for demographics and including instructor as a random effect.
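    A minimal sketch of these five performance models, again using lme4 syntax and hypothetical variable names rather than the authors’ actual workflow:

```r
# Sketch of the five performance models (variable names assumed): each model
# regresses final course percentage on one barrier-category score,
# controlling for demographics, with instructor as a random effect
library(lme4)
library(lmerTest)

categories <- c("technical", "organization", "social",
                "engagement", "environment")
demographics <- c("gender", "race_ethnicity", "class_rank", "first_gen",
                  "language", "career", "institution_type")

models <- lapply(categories, function(cat) {
  f <- reformulate(c(cat, demographics, "(1 | instructor_id)"),
                   response = "course_percent")
  lmer(f, data = survey_data)
})
names(models) <- categories

summary(models$technical)  # e.g., estimate for the technical-resources score
```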

    Open-Ended Coding

    We developed a codebook for each of the two open-ended prompts (Supplemental Tables 3 and 4). The prompt 1 coding was designed to identify whether students were interpreting the survey items as intended and included three levels of interpretation (i.e., intended, ambiguous, or unintended interpretation). The prompt 2 coding sought to classify student suggestions and also included three primary levels (i.e., specific suggestion for improvement, affirmation of a current practice, or statement of no specific suggestion). For both prompts, we included two additional codes that accounted for students who provided an entirely off-topic response or did not respond. To apply these codes, seven authors (A.M.U., D.L.KW., K.R.B., L.A.W., B.A.C., G.B.J., and S.K.S.) conducted two initial practice rounds to refine the codebooks. Then, groups of three to five researchers separately co-coded batches of five responses for each item until all 30 items reached 80% agreement across two consecutive rounds, at which point one author (A.M.U.) applied all codes to the remaining prompt 1 items.
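    The 80% threshold corresponds to simple percent agreement; one plausible operationalization for a pair of coders (the coder vectors are assumed) is:

```r
# Simple percent agreement between two coders on the same batch of responses
# (coder_a and coder_b are assumed vectors of assigned codes)
percent_agreement <- mean(coder_a == coder_b)
percent_agreement >= 0.80  # criterion applied across two consecutive rounds
```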

    For prompt 2, we conducted a parallel process to capture more detailed information about student suggestions. During the initial coding rounds, the same authors captured key phrases from student responses that had been given a primary code (i.e., responses that were not off-topic or blank). One author (A.M.U.) then read through these key phrases and used an inductive process to develop an initial set of response categories, which was reviewed by the larger group. This author then applied the categories to another small subset of responses, and changes were made where necessary in consultation with the larger group. Once the categories were deemed to reasonably capture the range of student suggestions, the same author coded all remaining responses.

    This research was approved by the institutional review boards of all three institutions involved in the research and data collection.

    RESULTS

    Instrument Characteristics

    We conducted a CFA to determine whether the five barrier categories represented discernible factors. The results from this analysis provided support for our five-factor model (CFI = 0.909, TLI = 0.899, RMSEA = 0.067, SRMR = 0.059), and all of the items had adequate loadings onto their respective factors (Table 3). The Cronbach’s alphas (i.e., measures of internal reliability) for each scale were strong, ranging from 0.80 to 0.95. Based on the CFA, the five categories can be considered distinct factors, yet we also found that they had some degree of relation, with Pearson correlations between categories ranging from 0.150 to 0.435 (Table 4).

    TABLE 3. CFA of final survey, using principal component analysis extraction and Promax rotation

    The 30 items loaded onto five factors corresponding to the five barrier categories; each item’s loading on its category factor is shown.

    Technical resources (α = 0.95)
     Access to device: 0.891
     Access to software: 0.921
     Access to Internet in residence: 0.769
     Comfort with devices: 0.884
     Ease of software use: 0.869
     Convenience of access to Internet: 0.847
    Instructor organization (α = 0.91)
     Access to instructor or TA for FA help: 0.852
     Responsiveness of instructor or TA to questions: 0.789
     Comfort contacting instructor or TA: 0.807
     Ease of locating FA due date(s): 0.683
     Clarity of FA instructions: 0.778
     Guidance on online FA delivery system: 0.809
    Social interactions (α = 0.91)
     Usefulness of other students as a resource for help on FA: 0.854
     Comfort reaching out to other students for help on FA: 0.813
     Interactions with classmates in other parts of the course fostering interactions when working on the FA: 0.814
     Working with classmates on the FA fostering study groups: 0.833
     Support from people outside the course: 0.634
     Interaction among students when completing the FA: 0.791
    Personal engagement (α = 0.87)
     Adequate time to complete the FA: 0.811
     Time for personal responsibilities after completing the FA: 0.784
     Responsibility for getting the most out of FA: 0.603
     Following through on plans to complete the FA: 0.747
     Priority to complete FA: 0.665
     Ease of getting started on FA: 0.712
    Learning environment (α = 0.80)
     Multitasking while completing FA: 0.704
     Focus despite notifications on devices: 0.698
     Focus despite other online activities: 0.736
     Devices on silent while completing FA: 0.558
     Complete FA without interruptions: 0.566
     Television while completing FA: 0.567

    TABLE 4. Barrier category Pearson correlation matrix

    Factor                        1       2       3       4       5
    1. Technical resources       1
    2. Instructor organization   0.323   1
    3. Social interactions       0.135   0.150   1
    4. Personal engagement       0.401   0.388   0.205   1
    5. Learning environment      0.214   0.279   0.221   0.435   1

    We analyzed open-ended explanations of students’ answer selections (prompt 1) to gauge item interpretation (Table 5). We found that, in most cases (86%), students understood the question as intended, whereas relatively few (2%) explanations indicated a misinterpretation of the item. Taken together with student interviews, these results provide additional support for the validity of student Likert responses as a reflection of their perceptions related to each item.

    TABLE 5. Student responses to open-ended prompt 1

    “You responded that you [Likert response] with the following statement: [Item text]. In 1–2 sentences, please explain why you [Likert response] with the statement.”
    Response type                      n^a     % of responses
    Intended item interpretation       2182    86
    Ambiguous response                 173     7
    Unintended item interpretation     61      2
    Unrelated/random response          14      1
    Answered “N/A”                     30      1
    Left blank                         64      3

    aNumbers exceed sample size, because each student completed this prompt twice.

    How Commonly Do Students Report That They Experience Barriers to Online FAs?

    To investigate barriers that students face when engaging with online FAs, we plotted the distribution of mean Likert scores (access scores) for each barrier category. Our mixed-effects model and associated post hoc tests suggested that there were differences (p < 0.001) between all barrier categories, except for instructor organization and personal engagement, with students citing the fewest barriers in the technical resources category and the most barriers in the social interactions category (Figure 2 and Supplemental Table 5). We also found that the social interactions category had the broadest range of responses, as roughly half of all students gave neutral- to disagree-type responses. In contrast, nearly all students gave agree-type responses in the technical resources category. While these results indicate barriers within categories, we also wanted to get a sense of the degree to which individual students faced barriers across multiple categories. Thus, we analyzed the distribution and number of categories for which students indicated they had sufficient access and found that the majority of students reported sufficient access in at least four of the five categories (Figure 3).

    FIGURE 2.

    FIGURE 2. Score distributions for the five barrier categories. Central bars represent category median score, “×” represents category mean, boxes represent inner quartiles, whiskers represent minimum/maximum scores up to 1.5 times the interquartile range (IQR), and outliers represent scores outside 1.5 × IQR. N = 1262 student participants. See Supplemental Table 5 for full model statistics. Barrier categories not sharing the same letter are significantly different from each other in post hoc analysis.

    FIGURE 3.

    FIGURE 3. Distribution of barrier categories with sufficient access. Bars represent the percentage of students who have sufficient access in the given number of barrier categories. Students were considered to have sufficient access to a category if they had an access score greater than 4.5, reflecting mostly “strongly agree,” “agree,” and “somewhat agree” responses to the underlying items. The x-axis values represent number of barrier categories and do not correspond to specific categories. For example, a student with sufficient access in one category could have identified access to any one of the five categories.

    To more specifically identify the possible barriers present within each category, we analyzed for each item the percent of students who did not agree with the statement (i.e., they lack access to some extent). The results for all 30 items can be found in Supplemental Table 1, and we report here the item that emerged as the most common barrier in each category (Table 6). Within the technical resources category, 7% of students reported that they do not have a reliable Internet connection where they live. In the instructor organization category, 19% of students cited that their instructors or teaching assistants (TAs) are not generally responsive to questions. Participants reported multiple barriers in the social interactions category, with the most prevalent being the inability to work with other students and form study groups (55%). Within the personal engagement factor, some students (18%) disagreed that it is easy to get started on the FA. Finally, in the learning environment category, almost half of the participants (49%) responded that they do not put their electronic devices on silent when completing an online FA. With respect to student access across items, we found that the majority of students had sufficient access to at least 75% of the 30 survey items (Figure 4).

    TABLE 6. Item most commonly reported as a barriera in each category

    Technical resources
     “I have a reliable Internet connection where I live that enables me to complete the FA.” (n = 83; 7%)
    Instructor organization
     “My main point of contact, such as the instructor or teaching assistant, is generally responsive to questions about the FA.” (n = 240; 19%)
    Social interactions
     “Working with other students on the FA has helped me form study groups for exams or other parts of the course.” (n = 698; 55%)
    Personal engagement
     “I find it easy to get started on the FA.” (n = 233; 18%)
    Learning environment
     “I put my other electronic devices on silent while I am completing the FA.” (n = 620; 49%)

    aAn item was considered a barrier if students selected “strongly disagree,” “disagree,” “somewhat disagree,” or “neither agree nor disagree” for the item.

    FIGURE 4.

    FIGURE 4. Distribution of barrier items with sufficient access. Bars represent the percentage of students who have sufficient access to the items. Students were considered to have sufficient access to an item if they selected “strongly agree,” “agree,” or “somewhat agree,” for the item. The x-axis values represent number of survey items and do not correspond to specific item numbers. For example, a student with sufficient access to 10 items could have identified access to any 10 of the 30 survey items.

    Do Student Demographic Factors Relate to Particular Perceived Barriers?

    We examined the degree to which student demographics related to their access to online FAs. By estimating five separate mixed-effects models, we examined the relationships that these demographic attributes had with each of the five factors (Table 7). For gender, results suggested that there was no association with any of the barrier categories. The race/ethnicity results suggested associations with all categories, except technical resources, although the directions of these associations varied by category. Underrepresented racial/ethnic minority (URM) students reported higher instructor organization, personal engagement, and learning environment access than did non-URM students. Conversely, non-URM students reported higher access to social interactions than did URM students. Class rank did not commonly have a relationship with online FA access, but sophomores and seniors had higher access to social interactions than did first-year students. First-generation status related to the personal engagement category, with first-generation students experiencing more barriers in this area. Language spoken at home was related to technical resources, instructor organization, and personal engagement. Students who spoke English as a first language had higher access in these three categories than did those speaking other languages. Career plan was related to personal engagement barriers, with students planning to pursue a life sciences career having fewer barriers than those pursuing a different career field. Finally, we found that institution type (2-year or 4-year) only related to the instructor organization category, and students at a 2-year institution reported higher access to instructor organization than did those at a 4-year institution.

    TABLE 7. Mixed-effects modelsa to determine relationships between demographics and barrier scores

    Dependent variables (columns): technical resources | instructor organization | social interactions | personal engagement | learning environment. Cell entries are estimates ± SE^c for each demographic predictor.^b

    Gender (reference: female)
     Male: −0.091 ± 0.059 | 0.00 ± 0.05 | 0.00 ± 0.10 | −0.10 ± 0.06 | 0.05 ± 0.08
     Self-describe^d: −0.399 ± 0.881 | −1.00 ± 0.73 | −0.27 ± 1.41 | −1.19 ± 0.92 | −1.62 ± 1.11
    Race/ethnicity (reference: non-URM)
     URM^e: 0.024 ± 0.080 | 0.15 ± 0.07 | −0.26 ± 0.13 | 0.19 ± 0.08 | 0.21 ± 0.10
     Self-describe: 0.012 ± 0.298 | −0.10 ± 0.25 | 0.19 ± 0.48 | 0.19 ± 0.31 | 0.39 ± 0.38
    Class rank (reference: first-year)
     Sophomore: −0.241 ± 0.212 | 0.01 ± 0.18 | 0.91 ± 0.34 | 0.15 ± 0.22 | 0.08 ± 0.27
     Junior: −0.149 ± 0.214 | −0.05 ± 0.18 | 0.60 ± 0.35 | 0.08 ± 0.22 | −0.03 ± 0.27
     Senior: −0.178 ± 0.217 | 0.09 ± 0.18 | 0.75 ± 0.35 | 0.01 ± 0.23 | 0.08 ± 0.28
     Postbaccalaureate: −0.171 ± 0.227 | −0.07 ± 0.19 | 0.55 ± 0.37 | 0.00 ± 0.24 | −0.04 ± 0.29
     Graduate student: −0.014 ± 0.262 | 0.10 ± 0.22 | 0.06 ± 0.42 | 0.05 ± 0.28 | 0.31 ± 0.33
     Other: 0.053 ± 0.395 | −0.16 ± 0.33 | −0.01 ± 0.64 | 0.48 ± 0.42 | −0.28 ± 0.50
    First-generation status (reference: not first generation)
     First generation: −0.038 ± 0.057 | 0.01 ± 0.05 | 0.04 ± 0.09 | −0.15 ± 0.06 | 0.07 ± 0.07
    Language (reference: English)
     Non-English: −0.470 ± 0.091 | −0.28 ± 0.08 | 0.17 ± 0.15 | −0.32 ± 0.10 | −0.14 ± 0.12
    Career (reference: life sciences)
     Non–life sciences: −0.100 ± 0.061 | −0.02 ± 0.05 | −0.01 ± 0.10 | −0.13 ± 0.07 | −0.06 ± 0.08
    Institution type (reference: 4-year)
     2-year: −0.026 ± 0.075 | 0.41 ± 0.17 | 0.08 ± 0.35 | 0.19 ± 0.20 | 0.31 ± 0.15

    aBarrier category score ∼ gender + race/ethnicity + class rank + first-generation status + language + career + institution type. Instructor = random effect. Predictors were included in separate models for each barrier category. Significant relationships (p < 0.05) are in bold.

    bReference categories were selected based on the group with the most students.

    cEstimates indicate the effect based on being a member of the focal group in comparison to the reference group.

    dThose who self-described their gender all identified as nonbinary.

    eUnderrepresented racial/ethnic groups included participants who self-identified as African American/Black, Hispanic/Latinx, Native American/Alaska Native, or Native Hawaiian/Pacific Islander.

    While our study did not primarily seek to analyze differences between 2-year and 4-year institutions, we ran additional mixed-effects models to determine how demographic characteristics related to access at each institution type (Supplemental Table 6). We found that the URM demographic only associated with learning environment at the 2-year institutions, whereas this characteristic only related to personal engagement at the 4-year institution. Furthermore, we found that the connections between online FA access and class rank, first-generation, and career plan were primarily driven by students at the 4-year institution. Findings for the language demographic were robust to institution type: Students who did not speak English at home had lower access at both institution types for technical resources, instructor organization, and learning environment.

    Do Perceived Barriers Relate to Course Performance?

    We next estimated five regression models to identify associations between online FA barriers and course performance (Table 8 and Supplemental Tables 7–12). Accounting for demographics and instructor, we found that technical resources, instructor organization, and personal engagement related to course performance, with higher levels of access associated with higher course performance. The social interactions and learning environment barrier scores did not relate to course performance, even though these categories posed the most barriers to students. In an additional model, there was no association between course performance and the number of categories in which students had sufficient online FA access (Supplemental Table 13).

    TABLE 8. Summarized results from mixed-effects models investigating the relationship between barrier scores and course performance, accounting for demographics and instructora

    Barrier category (model predictor)    Estimate ± SE (outcome: course performance)
    Technical resources                   1.312 ± 0.358
    Instructor organization               0.937 ± 0.411
    Social interactions                   −0.295 ± 0.212
    Personal engagement                   2.039 ± 0.314
    Learning environment                  0.315 ± 0.266

    aSignificant relationships (p < 0.05) are in bold. See Supplemental Tables 7–12 for full model results.

    According to Students, What Can Instructors Do to Lower Perceived Barriers?

    We analyzed student open-ended suggestions (prompt 2) regarding ways to alleviate barriers (Table 9). Many students recommended specific changes, for example, “I think the instructor should try to make some group activity/assignments to build connections.” Others provided positive feedback on already occurring supports that benefit their FA experience, such as “I think my instructor has done a great job setting due dates far enough apart and gives us lots of time for preparation.” The majority of students gave no suggestion or indicated that the current situation was generally supportive, with responses such as “I don’t think I need anything else to support me,” or “The instructor is doing a good job as is.” Finally, some student responses provided no discernible information, such as when they were completely off-topic or left an answer blank.

    TABLE 9. Student responses based on barrier status to primary coding of open-ended prompt 2

    “You responded that you [Likert response] with the following statement: [Item text]. Is there something your instructor or institution could do to support you in regard to this statement? Please explain in 1–2 sentences.”
    Response type: n shown a barrier^a (% of barrier responses) | n not shown a barrier (% of nonbarrier responses)
    Something to improve/change: 150 (31%) | 257 (15%)
    Something that should continue: 9 (2%) | 96 (6%)
    No specific suggestions/nothing can be done: 251 (52%) | 1145 (67%)
    Unrelated/random response: 36 (7%) | 8 (0.5%)
    Answered “N/A”: 41 (8%) | 215 (12%)

    aItems shown to students were random, and therefore students may or may not have seen one of their barriers. An item was considered a barrier if students selected “strongly disagree,” “disagree,” “somewhat disagree,” or “neither agree nor disagree” for the item. Numbers exceed sample size, because each student was asked to complete this prompt twice.

    To gain more insight into student suggestions and give instructors more comprehensive student feedback, we also developed and applied a more detailed set of codes for any student responses that included discernible information. Our analysis revealed a variety of specific suggestions (Table 10). Overall, students suggested many changes in scheduling logistics (13%), course delivery method (11%), and the FA format (11%). Comments about scheduling logistics mostly referred to due dates: “Maybe spread out due dates throughout the unit. That way students aren’t so worried about everything being due on one day,” and “Working a job while going to school is sometimes hard. I wish the due dates were Sundays or before weekends.” Students also expressed a desire to have traditional in-person courses, which likely stemmed directly from the various accommodations due to the ongoing COVID-19 pandemic: “In person classes would have helped with this. Obviously COVID didn’t allow for that. There could have been some sort of optional virtual study groups put together.” There were several suggestions regarding the FA format, including the length and style of the assignment: “Make the assignments shorter, I think sometimes it’s a lot at once.”

    TABLE 10. Open-ended prompt 2 codebook with student responses

    “You responded that you [Likert response] with the following statement: [Item text]. Is there something your instructor or institution could do to support you in regard to this statement? Please explain in 1–2 sentences.”
    Each definition completes the stem “The instructor and/or institution could…”; percentages are of all coded responses.^a

    Technical
     Devices, Internet, or other resources (7.0%): provide students with the necessary devices, software, Internet access, technical support, or other resources. Example: “For those that don’t have access, having devices for students would be super important.”
    Formative assessment
     Scheduling logistics (13.0%): alter the due date schedule, assignment frequency, and/or provide more time to complete the FA. Example: “Maybe spread out due dates throughout the unit. That way students aren’t so worried about everything being due on one day.”
     Requirements/instructions (6.0%): clarify the requirements and instructions for the FA. Example: “More instructions on what we need to have in our [FA] would be nice.”
     FA alignment with lecture (2.0%): better align the content on the FA with the content covered in lecture. Example: “I think that making sure that the lecture closely aligns with the quiz timing will help students.”
     FA content (4.0%): alter the content or complexity of content that appears on the FAs. Example: “The pre-lecture quizzes sometimes test more in-depth ideas than what is ever addressed in the lecture.”
     FA format (11.0%): alter the format, including question style or number of questions, of the FA. Example: “The problems that require you to watch a video take much longer to complete. Maybe limiting the number of these kinds of questions would be beneficial.”
     Grading criteria (3.0%): alter the criteria used to grade the FA. Example: “24-hour grace periods would be nice, so like you can still turn it in without deduction up to 24 hour after the original due date but it will still get marked late and will be last graded.”
    Instructor qualities
     Flexibility/understanding (2.0%): be flexible and understanding when issues completing the FA arise. Example: “The instructor could be aware of possible technological issues students might be facing and respond accordingly to each individual case.”
     Friendliness/approachability (4.0%): be friendly and easy to talk to when issues or questions arise. Example: “Maybe the instructors could be more friendly and open when it comes to addressing student’s questions in class.”
     Instructor knowledge (0.4%): be more knowledgeable about the course content and software used in the course. Example: “Please teach a class to the other educators at [institution] how to use technology.”
     Consistent communication with students (8.0%): inform students when things such as due dates, content schedule, or assignment formats change. Example: “Every time there has been a sudden change, my instructor has been quick to inform the class as to why it happened.”
     Office hours (1.0%): alter office hour schedule and/or frequency. Example: “Make office hours more accessible.”
    Collaboration
     Create study groups (4.0%): form study groups for students to meet with for help on the FA. Example: “Creating study groups for the class.”
     Peer interaction/group activities (6.0%): alter the frequency and/or quality of peer interaction and group activities. Example: “Have more interactions during class. For Zoom, small breakout rooms (more than 2 people). In-person, making people talk to their neighbors about a topic or clicker question.”
     Encourage collaboration (2.0%): encourage students to use classmates as a resource for help on the FA. Example: “Maybe just to encourage working together on the pre-lecture reading quizzes.”
     Way to contact or meet other students (4.0%): provide ways for students to contact or meet each other so they can reach out for help when needed. Example: “Maybe facilitate a place outside of campus where students could plan on going to and meet people in the class. This could be like meet at the mill anytime between 5–7 if you want to work with others to study.”
    Course and instruction
     Feedback (3.0%): alter the speed, depth, or amount of feedback provided on FAs. Example: “Give better feedback when points are taken away on assignments.”
     Provide general guidance/study tips (3.0%): provide guidance on how to best approach the course, including study tips. Example: “I think the instructors could maybe explain why it’s important to minimize distractions while doing schoolwork.”
     Course delivery (11.0%): alter the way the course is delivered, either in person or remotely. Example: “Have the class in person.”
     Course pace (1.0%): alter the pace at which content in the course is covered. Example: “It’s just so fast paced that I get lost in lecture frequently.”
     Materials (2.0%): provide helpful or more plentiful materials for understanding FA content. Example: “The instructor can give us more opportunities and add additional resources that would help us learn the concepts.”
     Instruction content (2.0%): alter the content covered in the course. Example: “I think this course could be made easier, it is very advanced for an introductory class to biology. I think there are also so many concepts and less information would help.”
    Other
     Study environment (1.0%): provide accessible study spaces with adequate Internet. Example: “Make sure spaces are available at the school to work on homework with the Internet for people who do not have stable WiFi.”
     Support for underrepresented students (0.2%): offer additional supports for underrepresented students. Example: “Perhaps, create more foundations/scholarships/programs for students like me. It can really take the weight off [our] shoulders.”

    Note: percentages are based on n = 512 student responses.

    In addition to these overall results, we also explored the open-ended student feedback based on whether or not the responding student had previously indicated that the displayed item represented a barrier. The distribution of student response types (i.e., specific suggestion for improvement, affirmation of a current practice, or statement of no specific suggestion) tracked to some degree with their prior views of the item (Table 9), and a chi-square test of independence indicated a significant association between barrier status and response type, χ2(4, n = 2208) = 173.291, p < 0.001. Due to the overall high access levels, most students (79%) were shown an item that they did not view as a barrier, and these students tended to endorse an existing practice or give no suggestion. In the remaining 21% of cases, students were shown an item that they had earlier rated as a barrier. In these situations, students were more likely to make a specific suggestion for how to overcome the barrier, although they also often had no suggestion.
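    For readers who wish to run a similar analysis on their own course data, the sketch below illustrates this kind of chi-square test of independence in Python with scipy; it is our illustration, not the authors' code. The counts are hypothetical placeholders chosen only to match the 21%/79% split described above, and the study's reported 4 degrees of freedom imply that its actual table had more levels than this simplified 2 × 3 layout.

```python
# Hypothetical sketch of a chi-square test of independence between
# barrier status and open-ended response type (illustrative counts only).
import numpy as np
from scipy.stats import chi2_contingency

# Rows: item previously rated as a barrier vs. not a barrier.
# Columns: specific suggestion, affirmation of practice, no suggestion.
counts = np.array([
    [250, 70, 144],   # hypothetical counts for barrier items (21% of cases)
    [310, 700, 734],  # hypothetical counts for non-barrier items (79%)
])

chi2, p, dof, expected = chi2_contingency(counts)
print(f"chi2({dof}, n = {counts.sum()}) = {chi2:.3f}, p = {p:.3g}")
```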

    When students were giving feedback on an item they did not report as a barrier, they most commonly suggested changes to the FA format, communication with students, and course delivery (Figure 5). Conversely, in cases in which students had viewed the item as a barrier, they primarily mentioned scheduling logistics; devices, Internet, or other technical resources; and FA format. Many of the student suggestions provided tractable ways that the instructor might help improve assignment access. For example, a student from a 2-year college facing a social barrier suggested, “If study groups are something that you would see as helpful for your students, assign them groups, and then they can figure out what time works best to all meet but we need that initial push.” Additionally, a 4-year university student experiencing an instructor organization barrier responded, “The [due date] time should not change in the semester. The [due date] time should be at specific time as stated in syllabus.” While we did not notice marked differences in suggestion types across institution types (Supplemental Table 14), some suggestions appeared to be more common for certain instructors (Supplemental Table 15).


    FIGURE 5. Types of suggestions provided to prompt 2. Prompt 2 read: “You responded that you [Likert response] with the following statement: [Item text]. Is there something your instructor or institution could do to support you in regard to this statement? Please explain in 1–2 sentences.” An item was considered a barrier if the student had previously selected “strongly disagree,” “disagree,” “somewhat disagree,” or “neither agree nor disagree” for the item. Total number of responses = 512.
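    The coding rule described in this caption is simple enough to express directly; the following sketch (our illustration, not the authors' code) shows one way an instructor might implement it when scoring their own survey data.

```python
# Barrier-coding rule from the Figure 5 caption: any response at or below
# "neither agree nor disagree" marks the item as a barrier for that student.
BARRIER_RESPONSES = {
    "strongly disagree",
    "disagree",
    "somewhat disagree",
    "neither agree nor disagree",
}

def is_barrier(likert_response: str) -> bool:
    """Return True if a Likert response indicates the item was a barrier."""
    return likert_response.strip().lower() in BARRIER_RESPONSES

assert is_barrier("Somewhat disagree")
assert not is_barrier("agree")
```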

    Finally, we analyzed which suggestion types emerged from the different barrier categories (Figure 6). We found that items in the technical resources category elicited suggestions regarding technical resources and that the social interactions category produced suggestions related to collaboration and course instruction (mostly reflecting a desire to return to nonpandemic conditions). The personal engagement category most notably led to responses about the FA itself, including its structure, content, and policies (particularly due dates). The instructor organization and learning environment categories also had recommendations for the FA itself, with instructor organization also eliciting a number of responses related to instructor qualities, such as friendliness and approachability. While some responses aligned with the respective barrier categories, students often offered suggestions beyond a specific item or category, revealing that any particular barrier might be addressed in a variety of ways.


    FIGURE 6. Prompt 2 responses by barrier category. The full codebook can be found in Table 10. Total number of responses = 512.

    DISCUSSION

    Identifying barriers to student engagement represents an important step toward increasing student online FA access. Sufficient access is a necessary precondition for FAs to meet the five objectives outlined by Black and Wiliam (2009). Additionally, identifying groups of students at increased risk for certain barriers provides insight into how instructors can create a more equitable out-of-class experience. Using an existing framework for identifying barriers in online courses as a whole (Muilenburg and Berge, 2005), we developed a survey to examine access to online FAs with respect to the following areas: technical resources, instructor organization, social interactions, personal engagement, and learning environment. By measuring barriers to online FAs, we aimed to expand on prior work, while also providing more specific information that might help instructors adapt their assignments to better support student engagement. This framing allowed for the investigation of student engagement with out-of-class assignments within a traditional course structure. Overall, our results suggest that these barriers exist for online FA activities, that some demographic groups may be more at risk for certain barriers, and that select barrier categories relate to course performance.

    Using open-ended items, we gathered student suggestions for ways to improve the online FA experience through specific critiques of instructor practices such as “Give better feedback when points are taken away on assignments.” However, students also identified areas where they viewed the barrier as their own responsibility, with comments like “There is not anything my instructor can to do help. These are decisions I made as a student and can’t necessarily be prevented by the instructor.” Some instructors may also believe that addressing student barriers is out of their control or not part of their responsibility (Flanigan and Babchuk, 2022). This instrument serves as a link between these two perspectives, providing a mechanism for gauging student circumstances and finding ways to address barriers and optimize online FAs. Ultimately, the feedback and changes prompted by this information can help cultivate a more equitable learning experience.

    Barriers to Online Learning Occur in the Context of Online FAs

    Previous literature has characterized barriers in fully online courses, and we observed that these barriers also exist in the context of online FAs. Students reported different levels of access across the five barrier categories (Figure 2). These findings are consistent with previous work in online learning access suggesting that some factors may hinder access more than others (Muilenburg and Berge, 2005), but our work highlights that a majority of students report sufficient access within these categories. Unlike in some previous studies of online learning (Muilenburg and Berge, 2005; Assareh and Hosseini Bidokht, 2011), Internet access was not a major factor impeding access for the students responding to our survey. Because the technical questions asked about convenience and reliability as dimensions of access, the results also suggest that students do not struggle with these additional aspects of technical resources. Instructor organization, personal engagement, and learning environment had lower mean Likert scores, but more than 75% of students had a mean access score indicating reasonable agreement. Finally, social interactions presented the most issues, with only 50% of students agreeing that they had adequate access with respect to online FAs.
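    Instructors replicating this descriptive analysis could compute category-level access scores along the following lines. This is a minimal sketch assuming 7-point Likert coding (1 = strongly disagree to 7 = strongly agree) with student means of 5 or above counted as agreement; the item and category names are hypothetical, not the study's instrument.

```python
# Hypothetical sketch: per-student mean access scores by barrier category.
import pandas as pd

# Wide-format data: one row per student, one column per Likert item (1-7).
df = pd.DataFrame({
    "tech_1": [7, 6, 5, 7], "tech_2": [6, 6, 4, 7],      # technical resources
    "social_1": [4, 3, 6, 5], "social_2": [5, 2, 6, 3],  # social interactions
})

categories = {
    "technical resources": ["tech_1", "tech_2"],
    "social interactions": ["social_1", "social_2"],
}

for name, items in categories.items():
    student_means = df[items].mean(axis=1)         # mean access score per student
    pct_agree = (student_means >= 5).mean() * 100  # % of students indicating agreement
    print(f"{name}: mean = {student_means.mean():.2f}, agreeing = {pct_agree:.0f}%")
```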

    While we found that students commonly have sufficient access to online FAs, we recognize the importance of the students who reported more limited access. These students may be calling important attention to assignment features that the instructor does not realize present a problem for students. For example, one student noted struggling with software use, stating, “I just remember being confused and getting frustrated.” These students may also be reflecting challenges stemming from their personal circumstances, such as “I complete the quizzes at home and I can’t control what distractions are going on at my house because I live with other people.” This feedback can help instructors appreciate the array of obstacles faced by their students and motivate additional consideration regarding how a course might meet student needs.

    Connections between Student Demographics and Online FA Access

    Our survey results revealed that some demographic factors related to online FA barriers (Table 7). Like many studies of online learning barriers, we collected students’ gender and racial/ethnic identities. However, unlike previous work (Muilenburg and Berge, 2005; Palmer et al., 2013; Satici and Can, 2016; Sobhanian and Ye, 2016), we did not find that gender predicted any of the five barrier categories. In other STEM fields, such as computer science, gender can play a role in assignment preferences and time spent (Wilson, 2006), but our results suggest that gender does not have a strong connection with FA buy-in or access (Brazeal and Couch, 2017), perhaps reflecting differences in student experience across STEM fields.

    With respect to race/ethnicity, we observed that URM students reported more positive perspectives on the instructor organization, personal engagement, and learning environment categories. This result is encouraging, as previous work suggested that URM students experienced barriers because their preferred methods of learning were not used in the classroom (Palmer et al., 2013). Additionally, our finding that race/ethnicity was not related to technical resources differs from older work reporting differences in technology access (Muilenburg and Berge, 2005; Porter and Donthu, 2006). More recent work suggests that this technology access gap may be closing (Wladis et al., 2015), and our results suggest high levels of technology access for both URM and non-URM students. Conversely, similar to previous studies, URM students expressed greater barriers to social interactions (Muilenburg and Berge, 2005), suggesting that these students may feel more disconnected from their classmates. URM students also perform better in courses taught by instructors who share their racial/ethnic identities (Fairlie et al., 2014), suggesting that this social disconnection may extend beyond peers to instructors as well.

    Considering other demographic attributes, we found that sophomore and senior students had fewer barriers to social interaction, perhaps differing from first-year students because they have had time to form social connections on campus. First-generation status had significant effects on online FA access related to personal engagement. This aligns with previous studies finding that first-generation students had lower buy-in to some online FA assignments, hypothesized to stem from differences in their familiarity with collegiate expectations or from external time commitments (Brazeal and Couch, 2017). We found that students who did not speak English at home reported lower access with respect to technical resources, instructor organization, and personal engagement, suggesting that this characteristic is associated with multifaceted and potentially interrelated challenges to online homework. Previous studies did not include student language, so this finding provides a new avenue to explore regarding online FA accessibility.

    Finally, we found that community college students had fewer barriers in the instructor organization category. This difference may be explained by the smaller class sizes of community college courses; smaller classes have been positively correlated with academic achievement (Shin and Chung, 2009) and allow for more one-on-one time with the instructor, fostering more regular communication between instructors and students. We also identified demographic patterns tied to the specific institutions sampled, but these results are not generalizable to entire institution types. Rather, these findings provide an example of how two contexts might have different relationships between student characteristics and FA barriers. While the underlying reasons for demographic patterns remain complex and the specific results reflect the local contexts, these findings provide a motivation and entry point for instructors to think about how students from different backgrounds might interact with their homework assignments.

    Connections between Online FA Access and Course Performance

    We found that select barrier categories were related to course performance. This is an important finding, because few studies have examined how access to online FAs can affect a student’s broader course outcomes. We were interested in studying these connections because we anticipated a relationship between barriers and final course grades: students unable to complete assignments due to access issues likely see a drop in their FA grades as well as in their associated exam performance. We found that barriers in the technical resources, instructor organization, and personal engagement categories were associated with lower course performance (Table 8). These relationships point to the potential consequences of students having unequal access to core learning activities in their courses. However, we found no association between the number of categories with sufficient access and course performance, suggesting specificity in these connections (Supplemental Table 13). Because demographics were included in the models, these significant relationships suggest that individuals with similar demographic traits can differ in the barriers they experience, and these differences relate to course performance (Supplemental Tables 7–12). Furthermore, the finding that several demographic variables still related to course performance, even after accounting for online FA barriers, indicates that students from these groups face additional challenges, such as stereotype threat or test anxiety (Steele, 1997).
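    As one illustration of how such models can be specified, the sketch below regresses course grade on barrier-category access scores while controlling for demographic covariates, using ordinary least squares in statsmodels. The study's actual model specification may differ, and all variable names and data values here are hypothetical placeholders.

```python
# Hypothetical sketch: course performance modeled on barrier-category
# access scores with demographic covariates (statsmodels OLS).
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "grade":        [88, 72, 95, 64, 81, 77, 90, 68, 84, 79],
    "technical":    [6.5, 4.0, 7.0, 3.5, 5.5, 5.0, 6.8, 3.8, 6.0, 5.2],
    "organization": [6.0, 4.5, 6.5, 3.0, 5.0, 5.5, 6.2, 3.6, 5.8, 5.1],
    "engagement":   [5.5, 4.0, 6.8, 3.2, 5.2, 4.8, 6.4, 3.4, 5.6, 4.9],
    "first_gen":    [0, 1, 0, 1, 0, 1, 0, 1, 0, 1],
    "urm":          [0, 0, 1, 1, 0, 1, 0, 1, 1, 0],
})

# Barrier scores enter alongside demographic covariates, so their
# coefficients reflect associations with grade after accounting for demographics.
model = smf.ols("grade ~ technical + organization + engagement + first_gen + urm",
                data=df).fit()
print(model.params.round(2))
print(model.pvalues.round(3))
```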

    Students Identify Supports They Need to Improve Their Online FA Experiences

    Our last goal was to elicit suggestions regarding ways to alleviate barriers. While studies have characterized the barriers students face when learning online, few have asked students about the supports needed to eliminate the identified barriers. We asked students what they think could be done to increase their online FA access. These suggestions not only provided ways to potentially improve the online FA experience but also allowed us to understand factors that contribute to underlying barriers. Additionally, these suggestions reflect the important roles that both instructors and students have in addressing barriers. Students provided specific suggestions that their instructors could implement but also indicated that they view some components of online FA access as their own responsibility (e.g., turning off distractions, emailing the instructor when they are confused). Students often offered suggestions outside the specific barrier item or category, suggesting that improving access involves a comprehensive array of options. Similarly, students identified instructor characteristics not directly tied to the FA, such as friendliness and approachability, suggesting additional influences that an instructor has on student engagement outside FA design and administration.

    Responses from students who indicated limited access to an item are particularly valuable, because they provide insight into how a barrier they identified could be addressed. Students in this group most commonly suggested changes to scheduling logistics; devices, Internet, or other technical resources; and FA format (Figure 5). Students experiencing barriers made suggestions about the due dates and time given to complete an assignment, desiring consistent due dates and ample time for completion. With respect to devices, Internet, or other technical resources, students often requested that the institution provide the necessary hardware and software. Finally, when discussing FA format, students requested changes to the style of question (e.g., multiple-choice or questions requiring videos). In most cases, students made suggestions that would reduce the time or mental load required for the online FA. Taken together, these responses suggest that for students experiencing online FA barriers, instructors and institutions can improve engagement by providing ample time for online FA completion, providing the necessary technical resources, and designing digestible online FAs.

    Limitations

    For researchers and instructors to gain a fair picture of student online FA access, it is important to discuss study limitations. First, we must consider the confounding effects of response bias. Though our survey was designed to investigate barriers to online FAs, the survey itself was administered online. Students lacking sufficient access to the Internet or electronic devices may therefore have been less able to complete the survey. Similarly, students who had limited access due to other factors, such as time constraints, low self-efficacy, or suboptimal learning environments, may also have had less opportunity to complete the survey. Thus, the results presented here may systematically underestimate the barriers faced by the full range of students in a course. Six percent of enrolled students completed the survey but were excluded due to nonconsent; these students evidently had sufficient Internet access and learning environments to respond. This leaves an important group of roughly 15% of enrolled students for whom we lack survey information and who may face disproportionate barriers. Second, while we wished to include all individuals in the statistical models, the sample sizes for some demographic groups were small and therefore potentially less representative of the broader group. Similarly, while each institution is unique, the sample sizes for the 2-year institutions were small, and therefore we grouped them together for analysis. Finally, this study was conducted during the COVID-19 pandemic, which resulted in widespread disruptions to course delivery and social interactions. While we certainly saw indications of these challenges in our data and the responses may have indicated more barriers than prepandemic conditions, we still view the results as informing our broader understanding of the educational system.

    Implications for Instruction

    While out-of-class barriers may seem beyond an instructor’s purview, our findings highlight potential opportunities instructors have to make their online FAs more accessible. Students made a variety of suggestions for ways that an instructor might directly alter assignments, and given our finding that students generally value FA activities and can describe important ways that FA assignments support their learning (Brazeal et al., 2016), we propose that instructors should take student suggestions into consideration and make changes to optimize student access. Students also gave responses that instructors may wish to address through increased messaging, such as by pointing students toward existing resources, creating increased visibility around course structures or assignment features, or providing more explicit rationale regarding activity design. While previous research has found that many instructors provide this type of guidance on the first day of class (Lane et al., 2021; Meaders et al., 2021), we have also noticed that students seem to most readily recall this messaging when it has been reiterated consistently across the semester (Brazeal et al., 2021).

    Students identified various ways that barriers from each category can be addressed. To address concerns about technical barriers, instructors can inform students about the device and software assistance available to them at their institutions. Many institutions offer device rental programs or have open-access computer labs, but students may be unaware of these resources. The barriers that students identify with respect to instructor organization have perhaps the clearest implications for instruction, because instructors have more direct control over these aspects. For this category, students noted the importance of instructor qualities, such as approachability, as well as suggestions regarding the FA itself, such as assignment logistics, requirements, alignment, content, format, and grading. The social interactions category had the lowest scores, and while instructors have less control over students’ out-of-class interactions, students made several suggestions for facilitating collaboration, such as by arranging study groups or providing online discussion boards. While social interactions form a basis for learning, students may need additional support and guidance on how to productively structure their out-of-class interactions to facilitate learning. Finally, when asked about their personal engagement and learning environment, students again had many suggestions related to the FA itself. While these suggestions represent a composite across several courses, instructors can administer the survey to their own students to better understand how to address the barriers faced in their particular course contexts. Figure 7 provides a road map for instructors who wish to survey their students and use the results to improve online FA engagement. This approach allows instructors to make targeted adjustments according to their student populations.


    FIGURE 7. Road map for instructors. Resources listed are Tables 3, 6, and 10, Figure 7, and Supplemental Table 1 within this paper.

    ACKNOWLEDGMENTS

    We thank the instructors who distributed the survey and the students who participated in this research. This material is based upon work supported by the National Science Foundation (NSF; DUE-1610621 and DUE-2044243). Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the NSF.

    REFERENCES

  • Aagaard, J. (2015). Drawn to distraction: A qualitative study of off-task use of educational technology. Computers & Education, 87, 90–97. https://doi.org/10.1016/j.compedu.2015.03.010
  • Abuhammad, S. (2020). Barriers to distance learning during the COVID-19 outbreak: A qualitative review from parents’ perspective. Heliyon, 6(11), e05482. https://doi.org/10.1016/j.heliyon.2020.e05482
  • Ainscow, M. (2016). Diversity and equity: A global education challenge. New Zealand Journal of Educational Studies, 51(2), 143–155. https://doi.org/10.1007/s40841-016-0056-x
  • Alexander, C., Chen, E., & Grumbach, K. (2009). How leaky is the health career pipeline? Minority student achievement in college gateway courses. Academic Medicine: Journal of the Association of American Medical Colleges, 84(6), 797–802. https://doi.org/10.1097/ACM.0b013e3181a3d948
  • Alruwais, N. (2018). Advantages and challenges of using e-assessment. International Journal of Information and Education Technology, 8, 34–37. https://doi.org/10.18178/ijiet.2018.8.1.1008
  • Assareh, A., & Hosseini Bidokht, M. (2011). Barriers to e-teaching and e-learning. Procedia Computer Science, 3, 791–795. https://doi.org/10.1016/j.procs.2010.12.129
  • Attia, N. A., Baig, L., Marzouk, Y. I., & Khan, A. (2017). The potential effect of technology and distractions on undergraduate students’ concentration. Pakistan Journal of Medical Sciences, 33(4), 860–865. https://doi.org/10.12669/pjms.334.12560
  • Baleni, Z. G. (2015). Online formative assessment in higher education: Its pros and cons. Electronic Journal of e-Learning, 13(4), 228–236.
  • Bates, R., & Khasawneh, S. (2007). Self-efficacy and college students’ perceptions and use of online learning systems. Computers in Human Behavior, 23(1), 175–191. https://doi.org/10.1016/j.chb.2004.04.004
  • Becker, K., Newton, C., & Sawang, S. (2013). A learner perspective on barriers to e-learning. Australian Journal of Adult Learning, 53, 35–57.
  • Bers, T. H., & Smith, K. E. (1991). Persistence of community college students: The influence of student intent and academic and social integration. Research in Higher Education, 32(5), 539–556. https://doi.org/10.1007/BF00992627
  • Black, P., & Wiliam, D. (2009). Developing the theory of formative assessment. Educational Assessment, Evaluation and Accountability, 21(1), 5–31. https://doi.org/10.1007/s11092-008-9068-5
  • Boekaerts, M., & Corno, L. (2005). Self-regulation in the classroom: A perspective on assessment and intervention. Applied Psychology: An International Review, 54(2), 199–231. https://doi.org/10.1111/j.1464-0597.2005.00205.x
  • Boitshwarelo, B., Reedy, A. K., & Billany, T. (2017). Envisioning the use of online tests in assessing twenty-first century learning: A literature review. Research and Practice in Technology Enhanced Learning, 12(1), 16. https://doi.org/10.1186/s41039-017-0055-7
  • Boston, C. (2002). The concept of formative assessment. Practical Assessment, Research, and Evaluation, 8, Article 9. https://doi.org/10.7275/KMCQ-DJ31
  • Brazeal, K. R., Brown, T. L., & Couch, B. A. (2016). Characterizing student perceptions of and buy-in toward common formative assessment techniques. CBE—Life Sciences Education, 15(4), ar73. https://doi.org/10.1187/cbe.16-03-0133
  • Brazeal, K. R., Brown, T. L., & Couch, B. A. (2021). Connecting activity implementation characteristics to student buy-in toward and utilization of formative assessments within undergraduate biology courses. Journal for STEM Education Research, 4, 329–362.
  • Brazeal, K. R., & Couch, B. A. (2017). Student buy-in toward formative assessments: The influence of student factors and importance for course success. Journal of Microbiology & Biology Education, 18(1). https://doi.org/10.1128/jmbe.v18i1.1235
  • Choy, S. (2001). Students whose parents did not go to college: Postsecondary access, persistence, and attainment. Washington, DC: National Center for Education Statistics, U.S. Department of Education. Retrieved April 11, 2022, from https://nces.ed.gov/pubs2001/2001126.pdf
  • Compton, J. I., Cox, E., & Laanan, F. S. (2006). Adult learners in transition. New Directions for Student Services, 2006(114), 73–80. https://doi.org/10.1002/ss.208
  • Dorman, J. P. (2001). Associations between classroom environment and academic efficacy. Learning Environments Research, 4(3), 243–257. https://doi.org/10.1023/A:1014490922622
  • Eddy, S. L., & Hogan, K. A. (2014). Getting under the hood: How and for whom does increasing course structure work? CBE—Life Sciences Education, 13(3), 453–468. https://doi.org/10.1187/cbe.14-03-0050
  • Elmahdi, I., Al-Hattami, A., & Fawzi, H. (2018). Using technology for formative assessment to improve students’ learning. Turkish Online Journal of Educational Technology—TOJET, 17(2), 182–188.
  • Ericsson, K. A., & Simon, H. A. (1980). Verbal reports as data. Psychological Review, 87(3), 215–251. https://doi.org/10.1037/0033-295X.87.3.215
  • Estrada, M., Burnett, M., Campbell, A. G., Campbell, P. B., Denetclaw, W. F., Gutiérrez, C. G., ... & Zavala, M. (2016). Improving underrepresented minority student persistence in STEM. CBE—Life Sciences Education, 15(3), es5. https://doi.org/10.1187/cbe.16-01-0038
  • Evans, D. J. R., Zeun, P., & Stanier, R. A. (2014). Motivating student learning using a formative assessment journey. Journal of Anatomy, 224(3), 296–303. https://doi.org/10.1111/joa.12117
  • Fairlie, R. W., Hoffmann, F., & Oreopoulos, P. (2014). A community college instructor like me: Race and ethnicity interactions in the classroom. American Economic Review, 104(8), 2567–2591. https://doi.org/10.1257/aer.104.8.2567
  • Flanigan, A. E., & Babchuk, W. A. (2022). Digital distraction in the classroom: Exploring instructor perceptions and reactions. Teaching in Higher Education, 27(3), 352–370. https://doi.org/10.1080/13562517.2020.1724937
  • Flanigan, A. E., & Kiewra, K. A. (2018). What college instructors can do about student cyber-slacking. Educational Psychology Review, 30(2), 585–597. https://doi.org/10.1007/s10648-017-9418-2
  • Freeman, S., Haak, D., & Wenderoth, M. P. (2011). Increased course structure improves performance in introductory biology. CBE—Life Sciences Education, 10(2), 175–186. https://doi.org/10.1187/cbe.10-08-0105
  • Gaytan, J., & McEwen, B. C. (2007). Effective online instructional and assessment strategies. American Journal of Distance Education, 21, 117–132. https://doi.org/10.1080/08923640701341653
  • Gikandi, J. W., Morrow, D., & Davis, N. E. (2011). Online formative assessment in higher education: A review of the literature. Computers & Education, 57(4), 2333–2351. https://doi.org/10.1016/j.compedu.2011.06.004
  • Greenstein, L. (2010). What teachers really need to know about formative assessment. Alexandria, VA: ASCD.
  • Haak, D. C., HilleRisLambers, J., Pitre, E., & Freeman, S. (2011). Increased structure and active learning reduce the achievement gap in introductory biology. Science, 332(6034), 1213–1216. https://doi.org/10.1126/science.1204820
  • Heuer, B., & King, K. (2004). Leading the band: The role of the instructor in online learning for educators. Journal of Interactive Learning, 3, 1–11.
  • Hill, M. C., & Epps, K. K. (2010). The impact of physical classroom environment on student satisfaction and student evaluation of teaching in the university environment. Academy of Educational Leadership Journal, 14(4), 16.
  • Hurtado, S., Cabrera, N. L., Lin, M. H., Arellano, L., & Espinosa, L. L. (2009). Diversifying science: Underrepresented student experiences in structured research programs. Research in Higher Education, 50(2), 189–214. https://doi.org/10.1007/s11162-008-9114-7
  • IBM. (2020). SPSS (Version 27) [Macintosh]. Armonk, NY.
  • Jaggars, S. (2011). Online learning: Does it help low-income and underprepared students? (CCRC Working Paper 26). Assessment of Evidence Series. New York: Community College Research Center, Teachers College, Columbia University. https://doi.org/10.7916/D82R40WD
  • Jung, I., Choi, S., Lim, C., & Leem, J. (2002). Effects of different types of interaction on learning achievement, satisfaction and participation in Web-based instruction. Innovations in Education and Teaching International, 39(2), 153–162. https://doi.org/10.1080/14703290252934603
  • Khan, S., & Khan, R. A. (2019). Online assessments: Exploring perspectives of university students. Education and Information Technologies, 24(1), 661–677. https://doi.org/10.1007/s10639-018-9797-0
  • Knight, J. K., Wise, S. B., & Southard, K. M. (2013). Understanding clicker discussions: Student reasoning and the impact of instructional cues. CBE—Life Sciences Education, 12(4), 645–654. https://doi.org/10.1187/cbe.13-05-0090
  • Knowles, E., & Kerkman, D. (2007). An investigation of students’ attitude and motivation toward online learning. InSight: A Collection of Faculty Scholarship, 2, 70–80.
  • Laal, M., & Ghodsi, S. M. (2012). Benefits of collaborative learning. Procedia—Social and Behavioral Sciences, 31, 486–490. https://doi.org/10.1016/j.sbspro.2011.12.091
  • Lane, A. K., Meaders, C. L., Stetzer, M. R., Vinson, E., Couch, B. A., Smith, M. K., & Stains, M. (2021). Making a first impression: Exploring what instructors do and say on the first day of introductory STEM courses. CBE—Life Sciences Education, 20(1), ar7.
  • Lei, S. A. (2010). Classroom physical design influencing student learning and evaluations of college instructors: A review of literature. Education, 131(1), 128–134.
  • Levy, J., Wubbels, T., Brekelmans, M., & Morganfield, B. (1997). Language and cultural factors in students’ perceptions of teacher communication style. International Journal of Intercultural Relations, 21(1), 29–56. https://doi.org/10.1016/S0147-1767(96)00005-3
  • Magalhães, P., Ferreira, D., Cunha, J., & Rosário, P. (2020). Online vs traditional homework: A systematic review on the benefits to students’ performance. Computers & Education, 152, 103869. https://doi.org/10.1016/j.compedu.2020.103869
  • Matz, R. L., Koester, B. P., Fiorini, S., Grom, G., Shepard, L., Stangor, C. G., ... & McKay, T. A. (2017). Patterns of gendered performance differences in large introductory courses at five research universities. AERA Open, 3(4). https://journals.sagepub.com/doi/full/10.1177/2332858417743754
  • Meaders, C. L., Senn, L. G., Couch, B. A., Lane, A. K., Stains, M., Stetzer, M. R., ... & Smith, M. K. (2021). Am I getting through? Surveying students on what messages they recall from the first day of STEM classes. International Journal of STEM Education, 8(1), 49. https://doi.org/10.1186/s40594-021-00306-y
  • Muilenburg, L. Y., & Berge, Z. L. (2005). Student barriers to online learning: A factor analytic study. Distance Education, 26(1), 29–48. https://doi.org/10.1080/01587910500081269
  • Mungania, P. (2004). Employees’ perceptions of barriers in e-learning: The relationship among barriers, demographics, and e-learning self-efficacy (Doctoral dissertation). University of Louisville. https://doi.org/10.18297/etd/1027
  • Natriello, G. (1987). Evaluation processes in schools and classrooms (No. 12). New York: Columbia University.
  • Offerdahl, E. G., McConnell, M., & Boyer, J. (2018). Can I have your recipe? Using a fidelity of implementation (FOI) framework to identify the key ingredients of formative assessment for learning. CBE—Life Sciences Education, 17(4), es16. https://www.lifescied.org/doi/full/10.1187/cbe.18-02-0029
  • Palmer, G., Bowman, L., & Harroff, P. (2013, May 29). Literature review: Barriers to participation in the online learning environment: The role of race and gender. Retrieved February 2, 2022, from https://newprairiepress.org/aerc/2013/papers/36
  • Porter, C. E., & Donthu, N. (2006). Using the technology acceptance model to explain how attitudes determine Internet usage: The role of perceived access barriers and demographics. Journal of Business Research, 59(9), 999–1007. https://doi.org/10.1016/j.jbusres.2006.06.003
  • Qualtrics. (2020). https://www.qualtrics.com
  • Rayner, G. M. (2008). Using ‘mastering biology’ to formatively improve student engagement and learning in first year biology. In Andre, K., Duff, A., Green, M. M., Nandurkar, O., & Quinn, D. (Eds.), Proceedings, ATN Assessment Conference 2008: Engaging Students in Assessment (pp. 1–11). Adelaide, Australia: University of South Australia.
  • Rosseel, Y. (2012). lavaan: An R package for structural equation modeling. Journal of Statistical Software, 48(2), 1–36. https://doi.org/10.18637/jss.v048.i02
  • Rovai, A. P. (2000). Online and traditional assessments: What is the difference? The Internet and Higher Education, 3(3), 141–151. https://doi.org/10.1016/S1096-7516(01)00028-8
  • Sadler, D. R. (1989). Formative assessment and the design of instructional systems. Instructional Science, 18(2), 119–144. https://doi.org/10.1007/BF00117714
  • SAS Institute. (2020). JMP Pro 15. Cary, NC.
  • Satici, S. A., & Can, G. (2016). Investigating academic self-efficacy of university students in terms of socio-demographic variables. Universal Journal of Educational Research, 4(8), 1874–1880.
  • Shin, I.-S., & Chung, J. Y. (2009). Class size and student achievement in the United States: A meta-analysis. KEDI Journal of Educational Policy, 6, 3–19.
  • Sobhanian, S., & Ye, Y. (2016). A comparative study of students’ use of peer learning according to selected demographics in the Graduate School of Business, Assumption University of Thailand. Scholar: Human Sciences, 8(1), 117.
  • Soller, A. L. (2001). Supporting social interaction in an intelligent collaborative learning system. International Journal of Artificial Intelligence in Education, 12(1), 40–62.
  • Stebleton, M., & Soria, K. (2013). Breaking down barriers: Academic obstacles of first-generation students at research universities. Learning Assistance Review, 17(2), 7–20.
  • Steele, C. M. (1997). A threat in the air: How stereotypes shape intellectual identity and performance. American Psychologist, 52(6), 613–629. https://doi.org/10.1037/0003-066X.52.6.613
  • Trumbull, E., & Lash, A. (2013). Understanding formative assessment: Insights from learning theory and measurement theory. San Francisco: WestEd.
  • Wilson, B. C. (2006). Gender differences in types of assignments preferred: Implications for computer science instruction. Journal of Educational Computing Research, 34(3), 245–255. https://doi.org/10.2190/7FLU-VKJL-86RM-5RQG
  • Wladis, C., Conway, K. M., & Hachey, A. C. (2015). The online STEM classroom—who succeeds? An exploration of the impact of ethnicity, gender, and non-traditional student characteristics in the community college context. Community College Review, 43(2), 142–164. https://doi.org/10.1177/0091552115571729
  • Zhao, L., Lu, Y., Huang, W., & Wang, Q. (2010). Internet inequality: The relationship between high school students’ Internet use in different locations and their Internet self-efficacy. Computers & Education, 55(4), 1405–1423. https://doi.org/10.1016/j.compedu.2010.05.010