
Automatic student engagement measurement using machine learning techniques: A literature study of data and methods

Published in: Multimedia Tools and Applications

Abstract

Student engagement is positively related to learning outcomes. Student engagement measurement has been studied in varied settings, from the traditional classroom to online learning. Advances in artificial intelligence and machine learning have fueled automatic student engagement analysis, which draws on several kinds of sensor data, such as audio, video, and physiological signals, across these settings. This paper presents a literature review of automatic student engagement measurement in classroom and online learning settings, covering data collection and annotation techniques, methods, and evaluation metrics. First, a generalized methodology for automatic student engagement analysis is discussed. We then describe the data collection techniques and annotation methods widely used in the literature and detail their limitations and advantages. The state-of-the-art machine learning methods and the evaluation metrics used to test those methods are reviewed. Additionally, we extend the review to existing datasets for evaluating automatic student engagement methods and to recent developments in machine learning methods on open-source datasets. Finally, we present a comprehensive comparison of the methods proposed on various public datasets, based on evaluation metrics and engagement types.
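To make the reviewed pipeline concrete, the sketch below (not taken from the paper) illustrates one typical supervised engagement-classification setup: pre-extracted per-clip features, a classical classifier, and standard evaluation metrics. The feature dimensionality, the four-level labels (as in DAiSEE), and the choice of an SVM with accuracy and macro F1 are illustrative assumptions only.

    # Minimal illustrative sketch of a supervised engagement classifier
    # (assumed setup; not the authors' method).
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC
    from sklearn.metrics import accuracy_score, f1_score

    rng = np.random.default_rng(0)

    # Placeholder data: 200 clips, 64 features each, 4 engagement levels (0-3).
    X = rng.normal(size=(200, 64))
    y = rng.integers(0, 4, size=200)

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, random_state=0, stratify=y
    )

    # Standardize features, then fit an SVM, one of the classical methods
    # commonly used for engagement recognition.
    scaler = StandardScaler().fit(X_train)
    clf = SVC(kernel="rbf", C=1.0).fit(scaler.transform(X_train), y_train)

    # Evaluate with two of the metrics discussed in the review.
    y_pred = clf.predict(scaler.transform(X_test))
    print("Accuracy:", accuracy_score(y_test, y_pred))
    print("Macro F1:", f1_score(y_test, y_pred, average="macro"))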

Data availability

The datasets analysed during the current study are not publicly available but are available from the corresponding author on reasonable request.

Abbreviations

MOOCs: Massive Open Online Courses
SOTA: State-of-the-Art
BPMS: Body Pressure Measurement System
DAiSEE: Dataset for Affective States in E-Environments
DT: Decision Tree
SVM: Support Vector Machine
KNN: K-Nearest Neighbors
LMS: Learning Management System
PCA: Principal Component Analysis
GBT: Gradient Boosting Trees
LR: Logistic Regression
AIC: Akaike Information Criterion
HMM: Hidden Markov Model
MLR: Multinomial Logistic Regression
CERT: Computer Expression Recognition Toolbox
2AFC: 2-Alternative Forced-Choice
ANU: Animation Unit
LBP-TOP: Local Binary Pattern in Three Orthogonal Planes
UNB: Updateable Naïve Bayes
BN: Bayes Net
RF: Random Forest
DL: Deep Learning
SPOC: Small Private Online Course
SURF: Speeded Up Robust Features
BOF: Bag of Features
YFD: Yale Face Dataset
AAM: Active Appearance Model
FDDB: Face Detection Dataset and Benchmark
LFW: Labeled Faces in the Wild
NN: Neural Network
BROMP: Baker Rodrigo Ocumpaugh Monitoring Protocol
MLP: Multilayer Perceptron
SVR: Support Vector Regression
LSTM: Long Short-Term Memory
CNN: Convolutional Neural Network
YOLO: You Only Look Once
FER: Facial Expression Recognition
HBCU: Historically Black College/University
EmotiW: Emotion Recognition in the Wild
Bi-GRU: Bidirectional Gated Recurrent Unit
MTCNN: Multi-Task Cascaded Convolutional Neural Networks
DFSTN: Deep Facial Spatiotemporal Network
DBN: Deep Belief Network
LDP: Local Directional Pattern
ML: Machine Learning
SIFT: Scale-Invariant Feature Transform

Author information

Corresponding author

Correspondence to Kuldeep Singh.

Ethics declarations

Funding and/or Conflicts of Interest/Competing Interests

The authors have no relevant financial or non-financial interests to disclose. The authors have no competing interests to declare that are relevant to the content of this article.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Mandia, S., Mitharwal, R. & Singh, K. Automatic student engagement measurement using machine learning techniques: A literature study of data and methods. Multimed Tools Appl 83, 49641–49672 (2024). https://doi.org/10.1007/s11042-023-17534-9
