Abstract
Fostering metacognition can be challenging in large-enrollment settings, particularly in STEM fields that emphasize problem-solving skills and their underlying theories. Herein, the research problem of realizing more frequent, insightful, and explicitly rewarded metacognitive activity at scale is investigated using a hierarchy of assessments. Referred to as the STEM-Optimal Digitized Assessment Strategy (SODAS), this targeted approach combines frequent assessment, instructor feedback, and learner self-reflection across the hierarchy of learning mechanisms comprising Bloom’s Taxonomy of Learning Domains. SODAS spans this hierarchy via a progression of (i) unregulated online assessment, (ii) proctored Computer-Based Assessment (CBA), (iii) problem-based learning activities assessed in the laboratory setting, and (iv) personalized Socratic discussions of the scanned scrap sheets that accompany each learner’s machine-graded formative assessments. Results of a case study integrating SODAS into a high-enrollment Mechanical Engineering Heat Transfer course at a large state university are presented for a cohort of 118 students. Six question types were delivered through lockdown-proctored testing with auto-grading in the Canvas Learning Management System (LMS), supplemented by bi-weekly laboratory activities addressing the higher layers of Bloom’s Taxonomy. Sample assessment formats were validated through student use, along with schedules of instructor responsibilities spanning four assessment tiers (facts, concepts, procedures, and metacognition), two testing delivery mechanisms (electronic textbook exercises and proctored CBA), and three remediation mechanisms (self-paced, score clarification, and experiment clarification). Results show that learning achievement can increase by up to 16.9% compared to conventional assessment strategies, while utilizing comparable instructor resources and workloads.
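Although the article itself contains no code, the tiered structure summarized above can be sketched as a small data model. The following Python snippet is an illustrative assumption only: the specific pairings of tiers to Bloom's levels and to remediation mechanisms shown here are hypothetical, not reported findings of the study, and all identifiers are invented for illustration.

```python
# Illustrative sketch (not from the article): one possible encoding of the SODAS
# tier structure, making the tier -> Bloom's levels -> delivery -> remediation
# mapping explicit. Tier/level/remediation pairings below are assumptions.
from dataclasses import dataclass


@dataclass(frozen=True)
class AssessmentTier:
    name: str            # tier of the assessment hierarchy
    bloom_levels: tuple  # Bloom's taxonomy levels the tier targets (assumed)
    delivery: str        # delivery mechanism for the tier
    remediation: str     # remediation mechanism attached to the tier (assumed)


# Four tiers, two testing delivery mechanisms, and three remediation mechanisms,
# as enumerated in the abstract; the assignments are hypothetical.
SODAS_TIERS = (
    AssessmentTier("facts", ("remember",),
                   "electronic textbook exercises", "self-paced"),
    AssessmentTier("concepts", ("understand",),
                   "electronic textbook exercises", "self-paced"),
    AssessmentTier("procedures", ("apply", "analyze"),
                   "proctored CBA", "score clarification"),
    AssessmentTier("metacognition", ("evaluate", "create"),
                   "laboratory PBL and Socratic review", "experiment clarification"),
)


def remediation_for(tier_name: str) -> str:
    """Return the remediation mechanism assigned to a given tier."""
    for tier in SODAS_TIERS:
        if tier.name == tier_name:
            return tier.remediation
    raise KeyError(f"unknown tier: {tier_name}")


if __name__ == "__main__":
    print(remediation_for("procedures"))  # -> "score clarification"
```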
Acknowledgements
The authors acknowledge the facilities, equipment, and support of the UCF College of Engineering and Computer Science, and the State University System of Florida’s Information Technology Program Performance Initiative.
Cite this article
DeMara, R.F., Tian, T. & Howard, W. Engineering assessment strata: A layered approach to evaluation spanning Bloom’s taxonomy of learning. Educ Inf Technol 24, 1147–1171 (2019). https://doi.org/10.1007/s10639-018-9812-5