Abstract
Inferring high-level cognitive states during interaction is a fundamental task in building proactive intelligent systems that would let users offload mental operations to a computational architecture. We introduce an improved machine-learning pipeline that predicts users' interactive behavior and performance from real-time eye tracking. Inference is carried out by a support-vector machine (SVM) over a large set of features computed from eye-movement data, linked to concurrent high-level behavioral codes derived from think-aloud protocols. Differences between cognitive states can be inferred from overt visual-attention patterns with above-chance accuracy, although overall accuracy remains low. The system can also classify and predict the performance of problem-solving users with up to 79% accuracy. We propose this prediction model as a general approach to understanding gaze in complex strategic behavior. The findings confirm that eye-movement data carry important information about problem-solving processes, and that proactive systems can benefit from real-time monitoring of visual attention.
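The kind of pipeline the abstract describes can be illustrated with a minimal sketch: an SVM classifier trained on per-trial eye-movement features to predict a binary performance label. The feature names and data below are synthetic placeholders for illustration only, not the chapter's actual feature set, and scikit-learn's `SVC` (which wraps LibSVM) stands in for the authors' implementation.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 200

# Hypothetical per-trial gaze features: mean fixation duration,
# fixation count, mean saccade amplitude, pupil-size variance.
X = rng.normal(size=(n, 4))

# Synthetic binary label standing in for high vs. low
# problem-solving performance.
y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.5, size=n) > 0).astype(int)

# Feature scaling matters for SVMs, hence the StandardScaler
# before the RBF-kernel classifier.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))

# 5-fold cross-validated accuracy, analogous to how such a
# classifier's performance would be estimated.
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean CV accuracy: {scores.mean():.2f}")
```

In practice the interesting work lies in the feature extraction (fixation and saccade statistics computed in real time) and in labeling trials via think-aloud coding; the classifier itself is the straightforward part.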
© 2013 Springer-Verlag London
Cite this chapter
Bednarik, R., Eivazi, S., Vrzakova, H. (2013). A Computational Approach for Prediction of Problem-Solving Behavior Using Support Vector Machines and Eye-Tracking Data. In: Nakano, Y., Conati, C., Bader, T. (eds) Eye Gaze in Intelligent User Interfaces. Springer, London. https://doi.org/10.1007/978-1-4471-4784-8_7
Print ISBN: 978-1-4471-4783-1
Online ISBN: 978-1-4471-4784-8