Introduction
The COVID-19 pandemic significantly disrupted education globally, affecting nearly 1.6 billion students. The subsequent shift to digital education, while offering advantages like improved access to data and diverse learning resources, also risks exacerbating existing digital inequities. Higher engineering education, increasingly adopting an Outcome-Based Education (OBE) approach, faces particular challenges in ensuring equitable curriculum design and assessment of online practical courses. These challenges stem from factors such as unequal access to digital resources and technology, variations in digital literacy, and complexities in online teacher-student interaction. This study aims to address these challenges by developing a digital teaching assistant system designed to promote educational equity in higher engineering disciplines.
Literature Review
Many digital assistance systems exist, but often lack features suitable for the specific needs of online engineering practical courses. A review of existing research highlights shortcomings in curriculum planning, such as insufficient problem-solving practices, peer collaboration, and student support, leading to lower course completion rates and engagement. The assessment of learning outcomes in such courses also presents significant challenges due to the subjective nature of evaluating design-based projects. While some systems utilize collaborative filtering for resource recommendation or employ data mining techniques to predict student performance, a comprehensive solution addressing the multifaceted aspects of digital equity in higher engineering education is lacking. The reviewed literature reveals that existing systems often focus on only one or two dimensions of educational equity (e.g., digital technology or user interaction), neglecting the integrated nature of the problem. This includes human factors (student attitudes and skills), environmental factors (learning environment), and technological factors (system stability and accessibility).
Methodology
This study proposes a multi-criteria group decision-making (MCGDM) model that integrates Quality Function Deployment (QFD) and the t-test to address the challenges of curriculum planning and learning outcome assessment. QFD translates user needs (core competencies) into course grading criteria, while the t-test determines whether differences in learning outcomes between teaching methods are statistically significant. An intelligent online teaching assistant system was developed based on this model. The system, built on a B/S (Browser/Server) architecture with an AMP stack (Apache, MySQL, PHP), provides visualization tools for displaying learning processes and outcomes, as well as intelligent assessment-analysis functions. A case study involving a Design Management and Strategy course with 48 senior industrial design students was conducted to evaluate the system. Students worked in teams to complete tasks using the Brainstorming and Crazy 8 methods, submitting their work through the system. Three teachers evaluated the student work against the QFD-derived criteria, and the system generated statistical results using the t-test. Teacher and student satisfaction surveys were also administered to compare the new system with an existing system (System A) widely used in Chinese higher engineering schools. The data were analyzed using SPSS 19.
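The statistical step of the methodology can be illustrated with a minimal sketch: an independent two-sample t-test comparing scores obtained under two teaching methods. The team scores below are hypothetical placeholders, not data from the study, and the pooled-variance form shown is one common variant of the test.

```python
# Sketch: independent two-sample t-test for comparing learning outcomes
# under two teaching methods. Scores are illustrative, not the study's data.
import math
from statistics import mean, variance

def t_statistic(a, b):
    """Student's t for two independent samples (pooled-variance form)."""
    na, nb = len(a), len(b)
    pooled = ((na - 1) * variance(a) + (nb - 1) * variance(b)) / (na + nb - 2)
    return (mean(a) - mean(b)) / math.sqrt(pooled * (1 / na + 1 / nb))

brainstorming = [82, 85, 78, 90, 88, 84, 79, 86]  # hypothetical team scores
crazy_8       = [75, 80, 72, 83, 77, 79, 74, 81]  # hypothetical team scores

t = t_statistic(brainstorming, crazy_8)
# With df = 14, the two-sided 5% critical value is about 2.145,
# so |t| above that threshold corresponds to p < 0.05.
significant = abs(t) > 2.145
```

In practice a statistics package (such as the SPSS 19 used in the study) would also report the exact p-value; the comparison against the critical value above gives the same accept/reject decision at the 5% level.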
Key Findings
The QFD analysis identified five key course grading criteria: Reasonable, Elaboration, Logicality, Insightful, and Collaboration. The t-test results showed a significant difference in learning outcomes between the Brainstorming and Crazy 8 methods: Brainstorming generally performed better, except on Elaboration, where Crazy 8 excelled. Comparison with System A revealed that the new system (System B) led to significantly better academic performance (p<0.05). Teacher and student satisfaction surveys indicated that System B was superior to System A in several key areas. Teachers highly valued System B's multi-dimensional learning evaluation reports and its teacher-student interaction features. Students favored System B's interface for publishing course tasks and grading standards, its timely feedback on evaluation results, and its overall support for collaborative learning. The Cronbach's alpha values for both the teacher and student questionnaires exceeded 0.9, indicating high reliability, and the KMO values indicated that the data were suitable for factor analysis.
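The reliability check reported above can be sketched as follows. Cronbach's alpha relates the sum of per-item variances to the variance of respondents' total scores; the Likert responses below are hypothetical and serve only to show the computation, not to reproduce the study's survey data.

```python
# Sketch: Cronbach's alpha for questionnaire reliability.
# Responses are illustrative, not the study's survey data.
from statistics import variance

def cronbach_alpha(items):
    """items: one list of respondent scores per questionnaire item."""
    k = len(items)
    sum_item_vars = sum(variance(item) for item in items)
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent totals
    return k / (k - 1) * (1 - sum_item_vars / variance(totals))

# Three hypothetical Likert items answered by six respondents.
responses = [
    [4, 5, 3, 5, 4, 2],
    [4, 4, 3, 5, 5, 2],
    [5, 5, 4, 5, 4, 3],
]
alpha = cronbach_alpha(responses)  # values above 0.9 indicate high reliability
```

A value above 0.9, as reported for both questionnaires in the study, is conventionally read as excellent internal consistency.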
Discussion
The findings demonstrate the effectiveness of the developed digital teaching assistant system in promoting educational equity in online engineering practical courses. The integration of QFD and t-test provided a robust and objective method for evaluating learning outcomes, addressing the subjectivity inherent in assessing design-based projects. The superior performance of System B compared to System A highlights the importance of designing systems that specifically cater to the needs and characteristics of higher engineering disciplines. The high levels of teacher and student satisfaction underscore the system's user-friendliness and its ability to enhance both teaching and learning experiences. The results suggest that a multi-faceted approach, incorporating aspects of curriculum design, assessment methods, and technological features, is crucial for ensuring equitable digital education.
Conclusion
This study successfully developed and implemented a digital teaching assistant system that effectively promotes educational equity in online higher engineering education. The system's strengths lie in its alignment with the unique characteristics of engineering disciplines, its objective assessment methods, and its user-friendly design, leading to improved learning outcomes and high teacher-student satisfaction. Future research could focus on expanding the system's functionality, conducting further comparative studies with other systems, and exploring its applicability across diverse engineering disciplines and educational contexts.
Limitations
The study's scope was limited to a single course and institution. The generalizability of the findings to other courses, institutions, and educational systems may require further investigation. The sample size, while sufficient for statistical analysis, might not fully capture the diversity of student populations and learning styles. The study primarily focused on the technical aspects and user experiences, while other factors contributing to educational equity, such as institutional policies and socio-economic factors, were not explicitly considered.