Abstract:
This paper explores the realization of viable, scalable, automated, and authentic alternatives to paper-only testing within Engineering disciplines. Currently, the manual delivery and grading of paper-based exams incur vast logistical burdens with low impact on learning achievement, especially as enrollments increase. Meanwhile, Engineering's design-oriented and problem-solving emphases pose substantial challenges to the digitized delivery of assessments and thus warrant a substantive evaluation of their validity. To address this research need, novel Computer-Based Assessment (CBA) infrastructures and delivery protocols were launched via an IRB-approved crossover study to investigate the impact of lockdown-proctored digitized quiz and exam delivery on test score validity and learning achievement within a large undergraduate Mechanical and Aerospace Engineering (MAE) course. Results indicate that well-formed CBAs can yield scores differing by as little as 0.6% from Paper-Based Assessments (PBAs). The crossover design de-correlated assessment delivery mode from technical topic, and results revealed that the cohort receiving CBA delivery and remediation attained up to 16.9% higher learning outcomes on summative assessment. These encouraging results are discussed in detail, along with lessons learned and suggestions for the transportability of CBA approaches to other Engineering courses and institutions.
Published in: 2018 IEEE Frontiers in Education Conference (FIE)
Date of Conference: 03-06 October 2018
Date Added to IEEE Xplore: 07 March 2019