Research Article
DOI: 10.1145/3641181.3641186

Revisiting Annotations in Online Student Engagement

Published: 11 April 2024

Abstract

Assessing student engagement in online classrooms is crucial to meeting learning objectives. Many machine learning and deep learning models have been proposed to address this problem using a variety of sensors, with video cameras being the most prominent. However, most of these approaches are not interoperable because different datasets use different labeling protocols; as a result, the classification models range from binary and multi-class to regression problems. Another problem is the lack of rigor in defining engagement when annotating the data. In this paper, we first show inconsistencies in the labeling of DAiSEE, a popular student engagement dataset. We then re-labeled more than 7,000 videos of this dataset using a methodical engagement annotation protocol, HELP, converting it from a four-class to a binary classification problem. Further analysis highlights issues in the DAiSEE annotations in comparison to the HELP protocol. Lastly, we tested three state-of-the-art deep learning and feature-based methods and discussed their performance. Data imbalance, in both the newly and the previously annotated data, was found to be the main obstacle to developing predictive models.
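Although the paper's re-annotation was performed by human labelers following the HELP protocol, the four-class-to-binary conversion and the class-imbalance issue the abstract mentions can be illustrated with a short sketch. The mapping threshold (DAiSEE levels 0–1 treated as "not engaged"), the toy label counts, and the inverse-frequency loss weighting below are illustrative assumptions, not the authors' method:

```python
# Illustrative sketch (not the authors' code): collapsing DAiSEE's four
# engagement levels into a binary label and weighting the loss to counter
# class imbalance. The 0-1 vs 2-3 threshold is an assumption; the paper's
# binary labels came from human re-annotation under the HELP protocol.
from collections import Counter

import torch
import torch.nn as nn


def to_binary(daisee_level: int) -> int:
    """Map a DAiSEE engagement level in {0, 1, 2, 3} to a binary label.

    Assumption: levels 0-1 (low engagement) -> 0 ("not engaged"),
    levels 2-3 -> 1 ("engaged").
    """
    return 0 if daisee_level <= 1 else 1


# Toy label distribution mimicking the skew the paper reports: most
# DAiSEE clips sit in the higher engagement levels.
four_class = [3] * 700 + [2] * 250 + [1] * 40 + [0] * 10
binary = [to_binary(y) for y in four_class]

counts = Counter(binary)  # e.g. Counter({1: 950, 0: 50})
n = len(binary)

# Inverse-frequency class weights for a weighted cross-entropy loss,
# one common remedy for imbalanced binary classification.
class_weights = torch.tensor(
    [n / (2 * counts[c]) for c in (0, 1)], dtype=torch.float
)
criterion = nn.CrossEntropyLoss(weight=class_weights)

print(counts, class_weights)  # Counter({1: 950, 0: 50}) tensor([10.0000, 0.5263])
```

Inverse-frequency weighting is only one possible remedy; resampling or focal-style losses are common alternatives when, as here, one class dominates.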


Published In

ICCDE '24: Proceedings of the 2024 10th International Conference on Computing and Data Engineering
January 2024
157 pages
ISBN: 9798400709319
DOI: 10.1145/3641181

Publisher

Association for Computing Machinery

New York, NY, United States

