Introduction
The COVID-19 pandemic forced universities worldwide to close their physical campuses and move teaching online. This abrupt shift raised concerns about its impact on student dropout rates, particularly in fields such as computer science, which already suffered high attrition before the pandemic. This study examines the effects of the transition on students at a large public university in Hungary. Beyond academic dropout, it also explores issues of inequality and accessibility stemming from the shift to online learning. Its primary aim is to quantify how the transition affected student academic performance, and what that implies for merit-based scholarship systems.
Literature Review
Existing literature on student dropout in higher education highlights interactional theories, emphasizing the interplay between student characteristics (personality, prior experience, commitment) and institutional factors (academic and social integration). Tinto's interactional model (1975, 2012) and subsequent research by Pascarella and Terenzini (1983), Braxton and Hirschy (2004), and others underline the crucial role of student engagement and social support in persistence. High dropout rates have significant economic and educational consequences (Di Pietro, 2006; Belloc et al., 2011; Cabrera et al., 2006). The shift to online education, while offering benefits like flexibility and accessibility, has also been linked to higher dropout rates (Carr, 2000; Rovai, 2003; Patterson and McFadden, 2009; Nistor and Neubauer, 2010; Willging and Johnson, 2019; Levy, 2007; Morris et al., 2005). This is often attributed to factors such as decreased social interaction, reduced access to learning support, and the need for greater self-directed learning (Wang et al., 2013; Serdyukov and Hill, 2013). The literature on the impact of the COVID-19 pandemic on student performance shows mixed results, with some studies reporting learning loss and increased stress (Andrew et al., 2020; Bayrakdar and Guveli, 2020; Brown et al., 2020; Bol, 2020; Rahiem, 2021; Abilleira et al., 2021; Daumiller et al., 2021; Engzell et al., 2021; Clark et al., 2021; Daniels et al., 2021; Mendoza et al., 2021), while others find no significant difference or even improved performance in some cases (Said, 2021; Iglesias-Pradas et al., 2021; Gonzalez et al., 2020; Yu et al., 2021). The limited research specifically addressing the impact of COVID-19-related online education on university dropout necessitates further investigation.
Methodology
This study uses data from a computer science BSc program at a large European university, comparing the cohort that started in 2018 (fully on-campus) with the cohort that started in 2019, whose instruction moved online in March 2020 when the COVID-19 lockdown began. A total of 862 students were included. Student performance across subjects was analyzed with Item Response Theory (IRT), specifically the Graded Response Model (GRM), fitted in Stata 15. IRT places student ability and item difficulty on a common scale, enabling a more nuanced comparison of performance between the on-campus and online environments. The analysis focused on final grades in professional subjects, considering only those with consistent assessment criteria across years. Two parameters were examined per subject: the slope, which reflects how well the subject discriminates between students of differing ability, and the difficulty, which indicates its overall level of challenge. Comparing these parameters across the 2018/2019 and 2019/2020 academic years shows how the shift to online learning affected exam difficulty and each subject's ability to distinguish between ability levels. Descriptive statistics were also used to compare grade point averages and dropout rates between the two groups. Finally, the researchers analyzed how the difficulty of obtaining each grade level (1-5) changed between the on-campus and online settings, using the GRM to determine the probability that a student of a given ability achieves a particular grade. This reveals whether specific grade thresholds became easier or harder to reach under online learning.
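The grade-probability machinery described above can be sketched compactly. In the GRM, each grade threshold k has a boundary curve P(grade ≥ k | θ) = 1 / (1 + exp(−a(θ − b_k))), where a is the subject's slope and b_k the threshold difficulty; the probability of a specific grade is the difference between adjacent boundary curves. The study fitted this model in Stata 15; the Python below is only a minimal illustration of the model's form, and the parameter values (theta, a, thresholds) are made up for demonstration, not estimates from the paper.

```python
import math

def p_grade_at_least(theta, a, b_k):
    """GRM boundary curve: probability that a student with ability
    `theta` obtains grade k or higher, given the subject's slope
    (discrimination) `a` and the threshold difficulty `b_k`."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b_k)))

def grade_probabilities(theta, a, thresholds):
    """Probability of each specific grade 1..5, obtained as differences
    of the cumulative boundary curves. `thresholds` holds b_2..b_5
    (difficulty of reaching at least grade 2, 3, 4, 5) and must be
    increasing."""
    cum = [1.0] + [p_grade_at_least(theta, a, b) for b in thresholds] + [0.0]
    return [cum[k] - cum[k + 1] for k in range(len(cum) - 1)]

# Illustrative (made-up) parameters: an average-ability student (theta=0)
# in a subject whose pass threshold b_2 sits below average ability.
probs = grade_probabilities(theta=0.0, a=1.5, thresholds=[-1.0, 0.0, 1.0, 2.0])
print([round(p, 3) for p in probs])  # P(grade=1) ... P(grade=5), sums to 1
```

In this framing, the study's finding that "passing became easier but top grades became harder" corresponds to the b_2 threshold shifting down while b_4 and b_5 shift up between the two academic years.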
Key Findings
The study's findings challenge some assumptions about the impact of online education during the pandemic. First, the shift to online learning did not increase dropout. The dropout rate among the 447 students in the online group (19 students, about 4.3%) was considerably lower than among the 438 students in the on-campus group (50 students, about 11.4%). Second, the IRT analysis revealed that in most subjects the difficulty of achieving a passing grade decreased in the 2019/2020 online environment compared with the 2018/2019 on-campus setting, suggesting that instructors adjusted requirements or grading criteria, implicitly or explicitly. Importantly, however, while passing grades became more accessible, achieving the higher grades (4 or 5) became more difficult under online learning, pointing to a grading strategy that prioritized passing over high achievement. Finally, despite the easier-to-attain passing grades, the online group (2019/2020) had a significantly lower grade point average (2.5) than the on-campus group (2018/2019; 3.3). Because merit-based scholarship systems typically rely on grade point averages, this discrepancy points to a potential inequity introduced by the shift to online learning.
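The dropout gap reported above (19/447 online vs. 50/438 on-campus) can be checked with a standard pooled two-proportion z-test. This is a generic sanity check of my choosing, not necessarily the significance test the authors used:

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Pooled two-proportion z-test. Returns the z statistic and a
    two-sided p-value computed from the normal CDF via math.erf."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Dropout counts reported in the study: online vs. on-campus cohorts.
z, p = two_proportion_z(19, 447, 50, 438)
print(f"online: {19/447:.1%}, on-campus: {50/438:.1%}, z = {z:.2f}, p = {p:.5f}")
```

With these counts the difference is well past conventional significance thresholds, consistent with the study's claim that the online cohort's dropout rate was considerably lower.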
Discussion
The findings reveal a complex interplay between the transition to online education, instructor grading practices, and student academic performance. While the shift to online learning did not produce a surge in dropout, it significantly altered the distribution of grades. That passing grades became more attainable while higher grades became harder to reach suggests instructors adjusted expectations or grading criteria, implicitly or explicitly, to account for the abrupt transition. This adjustment, however, may have inadvertently produced a less equitable outcome in scholarship allocation: the gap in grade point averages between on-campus and online students raises concerns about the fairness of existing scholarship systems and their potential to disadvantage online learners. Future scholarship systems should account for how different modes of education shape grading practices and ensure that all learners are assessed equitably. The study's implications extend beyond the immediate context of the COVID-19 pandemic, offering insights into the design of effective online learning environments and equitable assessment strategies.
Conclusion
This study demonstrates the complex effects of the COVID-19 pandemic and the shift to online learning on student academic performance and retention. While the transition did not drastically increase dropout rates, it significantly altered grade distributions and highlighted potential inequities in scholarship allocation. The findings underscore the need for institutions to develop robust support systems for online learners, including clearer communication of grading criteria and flexible learning structures. Future research should investigate the long-term effects of this transition, explore instructor perspectives on grading strategies during the pandemic, and consider how to create more equitable assessment and scholarship models for diverse learning environments.
Limitations
The study's focus on a single university and a specific computer science program limits the generalizability of the findings. Further research is needed to determine whether these patterns are consistent across other institutions, disciplines, and student populations. The analysis is also limited to the first two semesters of the program, potentially neglecting later effects. Additionally, individual student factors such as prior experience with online learning, access to resources, and personal circumstances are not fully accounted for in this study. Future research should incorporate such variables for a more comprehensive understanding. Finally, while the IRT analysis provides insight into the change in the difficulty of obtaining different grades, it does not directly explain the underlying reasons for these changes, which would require additional qualitative data and a deeper understanding of the grading strategies employed by instructors.