Remote learning slightly decreased student performance in an introductory undergraduate course on climate change

Education


S. Ghosh, S. Pulford, et al.

This study by Sattik Ghosh, Stephanie Pulford, and Arnold J. Bloom examined how 1,790 undergraduates performed in online versus face-to-face versions of an introductory climate change course. While online students scored about 2 percentage points lower, the authors suggest that the flexibility and accessibility of online instruction may offset this modest penalty.
Introduction

The study examines whether online instruction yields comparable learning outcomes to face-to-face instruction in an introductory undergraduate climate change course. Motivated by the growing reliance on online resources for public understanding of climate change, the increased prevalence of online courses during COVID-19, and longstanding debates about the efficacy and equity of distance education, the authors aim to provide an evidence-based comparison minimizing instructor and content confounds. The central research questions are: (1) how do student outcomes differ between online and face-to-face versions of the same course taught by the same instructor, (2) how do student demographics and selection into formats relate to outcomes, and (3) did the COVID-19 pandemic affect performance in the online format.

Literature Review

The paper situates its inquiry within a long history of debate over distance education, noting prior skepticism about quality despite equity motivations. A 2010 U.S. Department of Education meta-analysis suggested that online learning is as effective as classroom instruction, but a re-evaluation found that only four of its studies were methodologically appropriate, three of which showed poorer online outcomes. Several large-scale U.S. studies have reported substantially poorer persistence and grades in online courses, though many comparisons were confounded by differences in subject matter and instructor, or limited by small sample sizes, weakening causal inference. The authors emphasize the need to isolate format effects from selection biases and to consider equity implications, since underserved students often disproportionately enroll online.

Methodology

Design: Post-hoc pseudo-experimental analysis of 1,790 undergraduates at the University of California, Davis, enrolled in either an online or face-to-face version of the same introductory climate change course (SAS 25/25v). Both formats were taught by a single primary instructor across most terms; in one term a second instructor taught both formats concurrently, minimizing instructor-related confounds.

Offerings: Both formats in eight Winter quarters (2013–2020; except 2016, taught by a second instructor); online only in most Spring quarters (2013, 2014, 2015, 2017, 2018); and online only during Winter and Spring 2021 due to COVID-19.

Data: Student total course grade (0–100) and disaggregated assessment scores (quizzes, writing assignments, midterm, final, participation), along with demographics (e.g., gender, senior status, humanities major, URM, first-generation, low-income, language spoken at home) and prior academic performance (GPA). Student surveys in Winter 2019 and during the COVID-19 offerings probed past online learning experiences and motivations for format choice.

Course parity: Content was identical across formats; all students had access to recorded short video lectures and live lecture materials. Discussion sections differed by format: online sections met synchronously via video conferencing (up to 15 students, with more meeting-time options), while face-to-face sections met on campus (up to 25 students).

Assessments: Weekly online quizzes (10–12 multiple-choice questions drawn from ~50-item pools tied to the online textbook), weekly writing assignments (alternating exercises and essays), discussion participation (mostly attendance-based), a proctored midterm (25 multiple-choice questions plus one essay), and a proctored final (50 multiple-choice questions plus one essay).
Statistical analysis: Ordinary least squares linear regressions (R 4.0.3, lm) estimated three nested models of total course grade: Model 1 included only a format indicator (online vs. face-to-face) or a COVID indicator; Model 2 added controls for language spoken at home, male, senior, and humanities major; Model 3 further added GPA, URM, low-income, and first-generation status. Additional analyses compared outcomes between students who chose face-to-face in Winter quarters and those who took the online format when it was the only option (Spring), and contrasted performance by assessment type to gauge its dependence on course format. Significance was assessed via Wald tests.
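The nested-model structure described above can be sketched as follows. Since the study's dataset is not public, this is a minimal illustration on synthetic data: the paper used R's lm, whereas plain NumPy least squares is used here for self-containment, and all covariate distributions and the simulated −2-point online effect are assumptions for illustration, not the paper's data or estimates.

```python
# Illustrative sketch of the paper's nested OLS models of total course grade.
# Synthetic data only; coefficients will roughly recover the simulated values.
import numpy as np

rng = np.random.default_rng(0)
n = 1790  # sample size reported in the study

# Synthetic covariates (binary indicators except GPA)
online = rng.integers(0, 2, n)
male = rng.integers(0, 2, n)
senior = rng.integers(0, 2, n)
humanities = rng.integers(0, 2, n)
gpa = rng.normal(3.1, 0.5, n)

# Synthetic outcome with an assumed -2 point online effect, echoing the finding
grade = 85 - 2.0 * online + 5.0 * (gpa - 3.1) + rng.normal(0, 5, n)

def ols(y, *covariates):
    """Ordinary least squares with an intercept; returns coefficient vector."""
    X = np.column_stack([np.ones_like(y), *covariates])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

# Model 1: format only; Model 2 adds demographics; Model 3 adds GPA, etc.
m1 = ols(grade, online)
m2 = ols(grade, online, male, senior, humanities)
m3 = ols(grade, online, male, senior, humanities, gpa)
print(f"Online coefficient, Model 1: {m1[1]:.2f}")
print(f"Online coefficient, Model 3: {m3[1]:.2f}")
```

With the controls independent of format choice (as simulated here), the online coefficient stays near −2 across all three models, mirroring the stability the authors report; in the real data, the controls in Models 2 and 3 adjust for the nonrandom selection into formats.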

Key Findings
  • Online format penalty: Students in the online version scored approximately 2 percentage points lower than face-to-face peers on a 0–100 scale (Model 1 Online coefficient −2.07; Model 2, −1.99; Model 3, −2.00; all statistically significant).
  • COVID-19 effect: No significant difference in online student grades before versus during the pandemic; the pandemic per se showed no significant effect (Table S1).
  • Selection patterns: Women, seniors, and humanities majors disproportionately selected the online format in Winter quarters when both formats were offered. English language learners and humanities majors were overrepresented online due to scheduling and accessibility.
  • Predictors of performance: GPA was the strongest positive predictor of course grade across models. Seniors and humanities majors earned significantly lower grades than their peers; URM students also had lower grades in some models. First-generation students and those speaking mixed languages at home had slightly higher grades in certain specifications; low-income status did not differ significantly overall.
  • Assessment-specific results: No significant difference between formats on weekly quizzes (which drew solely from online textbook content), but online students scored lower on writing assignments, the midterm, and the final exam when both formats coexisted in Winter terms.
  • Participation: Online students attended 5.5% fewer discussion sections on average than face-to-face students, potentially contributing to lower performance on writing and exams.
Discussion

The findings indicate a small but consistent performance penalty for the online format when students had a choice, concentrated in assessments more dependent on interactions and discussion (writing, exams), while outcomes were equivalent on textbook-based quizzes. Lower participation in online discussion sections likely contributed to these differences. Importantly, when comparing students who had to take online (Spring) with those who chose face-to-face (Winter), outcomes did not significantly differ, suggesting selection into formats affects observed differences. Overall, the small average penalty may be an acceptable tradeoff considering online accessibility and scheduling benefits, especially when course content and instruction are carefully aligned across formats. The absence of a pandemic effect on online outcomes suggests that, for courses structured similarly to this one, online delivery can sustain learning outcomes with careful design.

Conclusion

Undergraduates choosing the online version of an introductory climate change course performed about 2 percentage points worse than those in the face-to-face format, while pandemic conditions did not significantly affect online outcomes. Given the convenience and accessibility of the online format, this modest penalty may be acceptable for many students, including those with employment, family responsibilities, athletics, study-abroad, or public health constraints. Future research should: (1) leverage expanded pandemic-era datasets to compare formats without selection bias, (2) develop and test interventions to improve engagement and participation in online discussion sections, and (3) evaluate generalizability to more technical, advanced, lab-based, or group-project-intensive courses.

Limitations
  • Nonrandom selection into formats before the pandemic introduces selection bias; although regressions controlled for observed characteristics and alternative comparisons were used, unobserved differences may remain.
  • The study focuses on a single introductory, lower-division elective course at one institution, limiting generalizability to advanced, technical, lab-based, or group-project courses.
  • Regression-based approaches cannot fully account for all selection effects; the supplemental materials discuss these constraints.
  • Participation metrics emphasized attendance and may not capture qualitative differences in engagement.