research-article

Feature Weighted Linguistics Classifier for Predicting Learning Difficulty Using Eye Tracking

Published: 18 May 2020

Abstract

This article presents a new approach to predicting learning difficulty in applications such as e-learning using eye movement and pupil response. We developed 12 eye-response features based on psycholinguistics, contextual information processing, anticipatory behavior analysis, recurrence fixation analysis, and pupillary response. A key aspect of the proposed approach is the temporal analysis of the feature response to the same concept: variations in eye response to the same concept over time are indicative of learning difficulty. A Feature Weighted Linguistics Classifier (FWLC) was developed to predict learning difficulty in real time, achieving an accuracy of 90%.
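The abstract does not specify the FWLC's internals, so as a rough illustration only, a feature-weighted classifier over eye-response features might be sketched as follows. All feature names, weights, and the threshold below are hypothetical assumptions, not values from the paper:

```python
# Hypothetical sketch of a feature-weighted classifier for learning
# difficulty. Feature names, weights, and threshold are illustrative
# assumptions; the paper's actual FWLC formulation is not given here.

# Per-concept eye-response features, assumed normalized to [0, 1].
features = {
    "fixation_duration": 0.8,
    "regression_count": 0.6,
    "pupil_dilation": 0.7,
    "refixation_rate": 0.5,
}

# Weights reflecting each feature's assumed relative importance.
weights = {
    "fixation_duration": 0.35,
    "regression_count": 0.25,
    "pupil_dilation": 0.25,
    "refixation_rate": 0.15,
}

def predict_difficulty(features, weights, threshold=0.5):
    """Return (score, is_difficult): weighted sum of features vs. threshold."""
    score = sum(weights[k] * features[k] for k in weights)
    return score, score > threshold

score, difficult = predict_difficulty(features, weights)
print(f"score={score:.3f}, difficult={difficult}")
```

In a real-time setting, the same scoring would be applied to each new window of gaze data as the learner rereads a concept, so the prediction can update as the temporal feature response evolves.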




• Published in

  ACM Transactions on Applied Perception, Volume 17, Issue 2
  April 2020
  82 pages
  ISSN: 1544-3558
  EISSN: 1544-3965
  DOI: 10.1145/3399405

        Copyright © 2020 ACM

        Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

        Publisher

        Association for Computing Machinery

        New York, NY, United States

        Publication History

        • Published: 18 May 2020
        • Online AM: 7 May 2020
        • Revised: 1 January 2020
        • Accepted: 1 January 2020
        • Received: 1 October 2017
Published in TAP Volume 17, Issue 2


        Qualifiers

        • research-article
        • Research
        • Refereed
