DOI: 10.1145/3299815.3314424
Research article

Modeling Students' Attention in the Classroom using Eyetrackers

Published: 18 April 2019

Abstract

The process of learning is determined not merely by what the instructor teaches, but also by how the student receives that information. An attentive student will naturally be more open to absorbing knowledge than a bored or frustrated one. In recent years, tools such as skin-temperature measurements and body-posture calculations have been developed to determine a student's affect, or emotional state of mind. Eye-gaze data is particularly noteworthy in that it can be collected non-intrusively, while the equipment is also relatively simple to set up and use. This paper details how data obtained from an eye-tracker can be used to predict a student's attention, as a measure of affect, over the course of a class. A prediction accuracy of 77% was achieved using the Extreme Gradient Boosting machine-learning technique. This outcome indicates that eye-gaze can indeed serve as the basis for constructing a predictive model.
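The pipeline the abstract describes, raw eye-tracker samples summarized into per-interval features that a boosted-tree classifier such as Extreme Gradient Boosting can consume, can be sketched as follows. This is a minimal illustration only: the dispersion-threshold (I-DT style) fixation detector, the threshold values, and the feature names are assumptions for demonstration, not the paper's actual method or data.

```python
# Hypothetical sketch: turning raw (time, x, y) eye-tracker samples into
# summary features of the kind a boosted-tree classifier could consume.
# Thresholds and feature names are illustrative, not from the paper.

def detect_fixations(samples, max_dispersion=30.0, min_duration=0.1):
    """Group (t, x, y) gaze samples into fixations with a simple
    dispersion-threshold detector; returns (start, end) time pairs."""
    fixations = []
    window = []
    for t, x, y in samples:
        window.append((t, x, y))
        xs = [p[1] for p in window]
        ys = [p[2] for p in window]
        dispersion = (max(xs) - min(xs)) + (max(ys) - min(ys))
        if dispersion > max_dispersion:
            # The window broke apart; emit the previous run if long enough.
            done, window = window[:-1], window[-1:]
            if done and done[-1][0] - done[0][0] >= min_duration:
                fixations.append((done[0][0], done[-1][0]))
    if window and window[-1][0] - window[0][0] >= min_duration:
        fixations.append((window[0][0], window[-1][0]))
    return fixations

def gaze_features(samples):
    """Per-interval features: fixation count and mean fixation duration."""
    fixations = detect_fixations(samples)
    durations = [end - start for start, end in fixations]
    return {
        "fixation_count": len(fixations),
        "mean_fixation_duration":
            sum(durations) / len(durations) if durations else 0.0,
    }

# A steady 0.5 s gaze at one point, then a jump to a distant point.
samples = [(i * 0.05, 100.0, 100.0) for i in range(10)] + \
          [(0.5 + i * 0.05, 400.0, 400.0) for i in range(10)]
feats = gaze_features(samples)
```

On the toy trace above, the jump splits the samples into two fixations of 0.45 s each. Feature vectors like `feats`, computed per time window and labeled with observed attention, would then be the training input to a gradient-boosted classifier.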


Cited By

  • (2024) Leveraging Eye Tracking and Targeted Regions of Interest for Analyzing Code Comprehension. Proceedings of the 2024 ACM Southeast Conference, 129-137. DOI: 10.1145/3603287.3651213
  • (2023) Research on Learning Concentration Recognition with Multi-Modal Features in Virtual Reality Environments. Sustainability 15(15), 11606. DOI: 10.3390/su151511606
  • (2023) An Approach of Analyzing Classroom Student Engagement in Multimodal Environment by Using Deep Learning. 2023 IEEE 9th International Women in Engineering (WIE) Conference on Electrical and Computer Engineering (WIECON-ECE), 286-291. DOI: 10.1109/WIECON-ECE60392.2023.10456488
  • (2023) Students' visual attention during teacher's talk as a predictor of mathematical achievement: a cautionary tale. Cogent Psychology 10(1). DOI: 10.1080/23311908.2023.2210947
  • (2022) A Review on Different Approaches for Assessing Student Attentiveness in Classroom using Behavioural Elements. 2022 2nd International Conference on Artificial Intelligence (ICAI), 152-158. DOI: 10.1109/ICAI55435.2022.9773418
  • (2022) Using AI-based NiCATS System to Evaluate Student Comprehension in Introductory Computer Programming Courses. 2022 IEEE Frontiers in Education Conference (FIE), 1-9. DOI: 10.1109/FIE56618.2022.9962681
  • (2022) Development and Field-Testing of a Non-intrusive Classroom Attention Tracking System (NiCATS) for Tracking Student Attention in CS Classrooms. 2022 IEEE Frontiers in Education Conference (FIE), 1-9. DOI: 10.1109/FIE56618.2022.9962447
  • (2022) Quantitative measures for classification of human upper body posture in video signal to improve online learning. The 9th International Conference of the Indonesian Chemical Society (ICICS 2021), 020005. DOI: 10.1063/5.0100044
  • (2022) Learner Attention Quantification Using Eye Tracking and EEG Signals. Proceedings of the Future Technologies Conference (FTC) 2022, Volume 2, 836-847. DOI: 10.1007/978-3-031-18458-1_57
  • (2021) Adaptive intelligent agent for e-learning: First report on enabling technology solutions. 2021 44th International Convention on Information, Communication and Electronic Technology (MIPRO), 1690-1694. DOI: 10.23919/MIPRO52101.2021.9596869


Published In

ACMSE '19: Proceedings of the 2019 ACM Southeast Conference
April 2019, 295 pages
ISBN: 9781450362511
DOI: 10.1145/3299815
Conference Chair: Dan Lo; Program Chair: Donghyun Kim; Publications Chair: Eric Gamess

Publisher

Association for Computing Machinery, New York, NY, United States


      Author Tags

      1. Affective Computing
      2. Attention
      3. Eyetracking
      4. Machine Learning

      Qualifiers

      • Research-article
      • Research
      • Refereed limited

Conference

ACM SE '19: 2019 ACM Southeast Conference
April 18-20, 2019
Kennesaw, GA, USA

      Acceptance Rates

      Overall Acceptance Rate 502 of 1,023 submissions, 49%


