DOI: 10.1145/3626641.3626938

Engagement Classification in E-Learning Environments Through Eye Features Using Scaled Dot-Product Attention Convolutional Neural Network

Published: 27 December 2023

Abstract

Engagement can be defined as the degree to which individuals are involved in and interact with a task that demands their attention and emotional investment. Analyzing a student’s emotional reactions across four affective states, namely boredom, engagement, confusion, and frustration, in an e-learning environment can reveal their learning behavior. By measuring student engagement and applying appropriate interventions, such as rearranging the learning design or building an effective feedback system, students’ learning gains can be increased. A person’s actions and intentions can be inferred from eye movement without knowing what the person is looking at. This study therefore used eye landmarks and gaze features extracted with OpenFace as input to a Convolutional Neural Network (CNN)-based classifier that extends the state-of-the-art model with the Scaled Dot-Product Attention mechanism to capture the most informative parts of the data. The model was compared with the state-of-the-art model and with machine learning baselines at three numbers of landmark points: 56, 20, and 10. The proposed model achieved 96.89% accuracy on students’ affective state classification using 20 eye landmark points and 95.81% on engagement level classification using 56 eye landmark points, whereas the Support Vector Machine reached 81.4% and 78.73% on the two tasks, respectively, and the Dense Neural Network reached 92.66% and 93.65%, respectively. It can be concluded that adding an attention mechanism boosts the model’s performance. In addition, accuracy was similar with 56 and 20 landmark points but dropped more noticeably with 10 points.
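The attention step referred to here is the standard scaled dot-product formulation, Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V. As a rough illustration only, the sketch below (in PyTorch; the layer sizes, the use of 1-D convolutions, the Q/K/V projections, the mean pooling, and the four-class output are all assumptions for the sake of the example, not the paper’s reported architecture) shows how such an attention layer can reweight CNN features extracted from eye-landmark sequences before classification into the four affective states.

```python
# Illustrative sketch only (not the authors' code): a 1-D CNN over OpenFace
# eye-landmark/gaze features followed by scaled dot-product attention.
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentionCNN(nn.Module):
    def __init__(self, in_channels: int, num_classes: int = 4, d_model: int = 64):
        super().__init__()
        # Convolutional feature extractor over the landmark feature sequence.
        self.conv1 = nn.Conv1d(in_channels, 32, kernel_size=3, padding=1)
        self.conv2 = nn.Conv1d(32, d_model, kernel_size=3, padding=1)
        # Learned projections for queries, keys, and values (assumed design).
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)
        self.classifier = nn.Linear(d_model, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, in_channels, sequence_length)
        h = F.relu(self.conv1(x))
        h = F.relu(self.conv2(h))          # (batch, d_model, seq_len)
        h = h.transpose(1, 2)              # (batch, seq_len, d_model)
        q, k, v = self.q_proj(h), self.k_proj(h), self.v_proj(h)
        # Scaled dot-product attention: softmax(QK^T / sqrt(d_k)) V
        d_k = q.size(-1)
        scores = q @ k.transpose(-2, -1) / d_k ** 0.5
        attended = torch.softmax(scores, dim=-1) @ v
        # Pool over the sequence and classify into the affective states.
        return self.classifier(attended.mean(dim=1))

# Example: a batch of 8 samples, 2 channels (x and y coordinates) over 20 landmarks.
model = AttentionCNN(in_channels=2)
logits = model(torch.randn(8, 2, 20))
print(logits.shape)  # torch.Size([8, 4])
```

The key property of this layer, consistent with the abstract’s motivation, is that the softmax weights let the classifier attend more strongly to the most informative landmark positions rather than treating all CNN features uniformly.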

      Published In

      SIET '23: Proceedings of the 8th International Conference on Sustainable Information Engineering and Technology
      October 2023
      722 pages
      ISBN:9798400708503
      DOI:10.1145/3626641

      Publisher

      Association for Computing Machinery

      New York, NY, United States


      Author Tags

      1. Convolutional Neural Network
      2. affective states
      3. attention
      4. classification
      5. engagement

      Qualifiers

      • Research-article
      • Research
      • Refereed limited

      Conference

      SIET 2023

      Acceptance Rates

      Overall Acceptance Rate 45 of 57 submissions, 79%
