Abstract
This article presents a new approach to predicting learning difficulty in applications such as e-learning, using eye movement and pupil response. We developed 12 eye-response features based on psycholinguistics, contextual information processing, anticipatory behavior analysis, recurrence fixation analysis, and pupillary response. A key aspect of the proposed approach is the temporal analysis of the feature response to the same concept. Results show that variations in eye response to the same concept over time are indicative of learning difficulty. A Feature Weighted Linguistics Classifier (FWLC) was developed to predict learning difficulty in real time; the proposed approach achieves an accuracy of 90%.
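The abstract does not detail the FWLC's internals, so the snippet below is only a minimal sketch of the general idea: a weighted decision over changes in per-concept eye-response features across repeated exposures to the same concept. The feature names, weights, threshold, and decision rule are assumptions for illustration, not the authors' actual design.

```python
import numpy as np

# Illustrative sketch only: feature names, weights, and threshold are assumed,
# not taken from the article.

# Hypothetical per-concept eye-response features (a subset of the 12 described):
FEATURE_NAMES = [
    "fixation_duration_ms",   # average fixation duration on the concept word
    "regression_count",       # regressions back to the concept
    "recurrence_rate",        # recurrence of fixations on the concept region
    "pupil_dilation_delta",   # change in pupil diameter vs. baseline
]

# Hypothetical feature weights (in practice these would be learned or tuned).
WEIGHTS = np.array([0.35, 0.25, 0.20, 0.20])

def difficulty_score(first_pass, second_pass):
    """Score learning difficulty from the change in eye response to the
    same concept across two encounters (temporal analysis)."""
    first = np.asarray(first_pass, dtype=float)
    second = np.asarray(second_pass, dtype=float)
    # Relative change per feature; little or no reduction on the second
    # encounter is taken as a sign of persistent difficulty.
    rel_change = (second - first) / np.maximum(np.abs(first), 1e-9)
    return float(WEIGHTS @ rel_change)

def predict_difficulty(first_pass, second_pass, threshold=0.0):
    """Binary prediction: 'difficult' if the weighted change score stays
    at or above the (assumed) threshold."""
    score = difficulty_score(first_pass, second_pass)
    return "difficult" if score >= threshold else "not difficult"

if __name__ == "__main__":
    # Example: eye response to the same concept on first and second exposure.
    first = [420.0, 3, 0.30, 0.12]    # long fixations, several regressions
    second = [410.0, 3, 0.32, 0.13]   # little improvement on re-reading
    print(predict_difficulty(first, second))  # -> "difficult"
```

In this sketch, the decision rests entirely on how the response to the same concept evolves over time, mirroring the temporal analysis the abstract highlights; a real system would learn the weights and threshold from labeled data rather than fixing them by hand.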