
Enhancing frame-level student engagement classification through knowledge transfer techniques


Abstract

Assessing student engagement in educational settings is critical for monitoring and improving the learning process. Traditional methods for classifying video-based student engagement datasets often assign a single engagement label to an entire video, which leads to inaccurate results because engagement varies over time as students' concentration and interest fluctuate. To overcome this limitation, this paper introduces a frame-level student engagement detection approach. By analyzing each frame individually, instructors gain finer-grained insight into students' understanding of the course and can pinpoint specific moments of disengagement or high engagement for targeted intervention. The scarcity of labeled frame-level engagement data, however, poses a significant challenge. To address it, we propose a novel frame-level student engagement classification approach that leverages knowledge transfer. We first pretrain a deep learning model on a labeled image-based student engagement dataset, WACV, which serves as the base dataset for identifying frame-level engagement in our target video-based DAiSEE dataset. We then fine-tune the model on the unlabeled video dataset, using the transferred knowledge to improve engagement classification performance. Experimental results demonstrate the effectiveness of our frame-level approach, which offers instructors actionable insights for optimizing instructional strategies and enhancing the learning experience. This research advances student engagement assessment by giving educators a more nuanced understanding of student behavior during instructional videos.
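
To make the transfer step concrete, here is a minimal PyTorch sketch of the general recipe the abstract describes: train a CNN backbone on the labeled image dataset, then freeze its early layers and fine-tune the classification head before applying it to individual video frames. The ResNet-18 backbone, the four engagement levels, and the layer-freezing scheme are illustrative assumptions rather than the authors' exact pipeline; the actual implementation is linked in the Notes below.

import torch
import torch.nn as nn
from torchvision import models

NUM_LEVELS = 4  # assumed number of engagement levels, matching DAiSEE's annotation scheme

# Stage 1: train a backbone on the labeled image-based (WACV) dataset.
# Starting from ImageNet weights is a common choice; this is an assumption.
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
backbone.fc = nn.Linear(backbone.fc.in_features, NUM_LEVELS)
# ... supervised training on the labeled engagement images goes here ...

# Stage 2: transfer to frame-level classification on the video dataset.
# Freeze the convolutional layers and fine-tune only the classifier head,
# one standard transfer-learning recipe.
for name, param in backbone.named_parameters():
    param.requires_grad = name.startswith("fc")

optimizer = torch.optim.Adam(
    (p for p in backbone.parameters() if p.requires_grad), lr=1e-4
)
criterion = nn.CrossEntropyLoss()

def classify_frames(frames: torch.Tensor) -> torch.Tensor:
    """Predict an engagement level for every frame of a (T, 3, H, W) clip."""
    backbone.eval()
    with torch.no_grad():
        logits = backbone(frames)  # shape (T, NUM_LEVELS)
    return logits.argmax(dim=1)    # one engagement label per frame

Running the classifier frame by frame yields a sequence of engagement labels over time, which is exactly the granularity that lets an instructor locate individual moments of disengagement rather than receiving a single per-video verdict.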


Availability of data and materials

The data that support the findings of this study are available from the corresponding author upon reasonable request.

Notes

  1. In the spirit of reproducible research, the code and dataset needed to reproduce the results of this paper are available at: https://github.com/rijju-das/Frame-level-student-engagement.


Acknowledgements

This research was conducted with the financial support of Science Foundation Ireland under Grant Agreement No. 13/RC/2106_P2 at the ADAPT SFI Research Centre at University College Dublin. ADAPT, the SFI Research Centre for AI-Driven Digital Content Technology, is funded by Science Foundation Ireland through the SFI Research Centres Programme.

Author information


Contributions

R. Das conducted the experiments. R. Das and S. Dev wrote the manuscript text. All authors reviewed the manuscript.

Corresponding author

Correspondence to Soumyabrata Dev.

Ethics declarations

Competing interests

The authors declare that they have no conflict of interest.

Ethical Approval

Not applicable.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Das, R., Dev, S. Enhancing frame-level student engagement classification through knowledge transfer techniques. Appl Intell 54, 2261–2276 (2024). https://doi.org/10.1007/s10489-023-05256-2

