research-article
DOI: 10.1145/2857491.2857529

On the necessity of adaptive eye movement classification in conditionally automated driving scenarios

Published: 14 March 2016

Abstract

Algorithms for eye movement classification can be divided into threshold-based and probabilistic methods. While the parameters of static threshold-based algorithms usually have to be chosen for the particular task (task-individual), probabilistic methods were introduced to adjust automatically to multiple individuals with different viewing behaviors (inter-individual). In the context of conditionally automated driving, especially while the driver is performing various secondary tasks, these two requirements of task- and inter-individuality combine into an even greater challenge. This paper shows how the combination of task- and inter-individual differences influences the viewing behavior of a driver during conditionally automated drives, and that state-of-the-art algorithms are unable to adapt sufficiently to these variances. To approach this challenge, an extended version of a Bayesian online learning algorithm is introduced that not only adapts its parameters to upcoming variances in the viewing behavior, but is also real-time capable with low computational overhead. The proposed approach is applied to a large-scale driving simulator study with 74 subjects performing secondary tasks while driving in an automated setting. The results show that the eye movement behavior of drivers performing different secondary tasks varies significantly, while remaining approximately consistent for idle drivers. Furthermore, the data shows that only a few of the parameters used to describe the eye movement behavior are responsible for these significant variations, indicating that it is not necessary to learn all parameters in an online fashion.
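To make the contrast between static thresholds and online adaptation concrete, the idea behind an adapting classifier can be sketched as follows. This is a minimal illustration, not the paper's algorithm: it labels each gaze sample as fixation or saccade by comparing per-class Gaussian velocity likelihoods, and adapts the class means with a running-mean update as a simple stand-in for full Bayesian inference. The prior values (20 and 300 deg/s, shared variance) are assumptions chosen for illustration; real values depend on the eye tracker, the individual, and the task.

```python
import math


class AdaptiveVelocityClassifier:
    """Illustrative sketch: two-class velocity model whose parameters
    adapt online, so the effective fixation/saccade boundary is not a
    fixed, hand-tuned threshold."""

    def __init__(self, fix_mean=20.0, sac_mean=300.0, var=50.0 ** 2):
        # Hypothetical priors in deg/s; n acts as a pseudo-count for the prior.
        self.mean = {"fixation": fix_mean, "saccade": sac_mean}
        self.var = {"fixation": var, "saccade": var}
        self.n = {"fixation": 1, "saccade": 1}

    def _likelihood(self, v, label):
        # Gaussian density of the sample velocity under the class model.
        m, s2 = self.mean[label], self.var[label]
        return math.exp(-(v - m) ** 2 / (2 * s2)) / math.sqrt(2 * math.pi * s2)

    def classify(self, velocity):
        # Pick the class whose model assigns the higher likelihood,
        # then fold the new observation back into that class's parameters.
        label = max(("fixation", "saccade"),
                    key=lambda lb: self._likelihood(velocity, lb))
        self._update(velocity, label)
        return label

    def _update(self, v, label):
        # Running-mean update: a simple surrogate for a Bayesian
        # posterior update of the class mean.
        self.n[label] += 1
        self.mean[label] += (v - self.mean[label]) / self.n[label]
```

Because every classified sample shifts the corresponding class mean, the decision boundary drifts toward the viewing behavior actually observed, which is the property the abstract argues is needed across both tasks and individuals.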



Published In

ETRA '16: Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications
March 2016
378 pages
ISBN:9781450341257
DOI:10.1145/2857491
Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. automated analysis methods
  2. eye movements and cognition
  3. machine learning methods and algorithms


Conference

ETRA '16: 2016 Symposium on Eye Tracking Research and Applications
March 14-17, 2016
Charleston, South Carolina

Acceptance Rates

Overall Acceptance Rate 69 of 137 submissions, 50%


Cited By

  • (2024) DLEmotion: Deep learning-based emotion classification using visual attention location information. Biomedical Signal Processing and Control 95, 106449. DOI: 10.1016/j.bspc.2024.106449
  • (2023) Eye Tracking, Usability, and User Experience: A Systematic Review. International Journal of Human-Computer Interaction 40:17, 4484-4500. DOI: 10.1080/10447318.2023.2221600
  • (2022) RETRACTED ARTICLE: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior Research Methods 55:1, 364-416. DOI: 10.3758/s13428-021-01762-8
  • (2022) Review and Evaluation of Eye Movement Event Detection Algorithms. Sensors 22:22, 8810. DOI: 10.3390/s22228810
  • (2022) Gaze-based Object Detection in the Wild. 2022 Sixth IEEE International Conference on Robotic Computing (IRC), 62-66. DOI: 10.1109/IRC55401.2022.00017
  • (2021) Classification of Eye Movement and Its Application in Driving Based on a Refined Pre-Processing and Machine Learning Algorithm. IEEE Access 9, 136164-136181. DOI: 10.1109/ACCESS.2021.3115961
  • (2020) MLGaze: Machine Learning-Based Analysis of Gaze Error Patterns in Consumer Eye Tracking Systems. Vision 4:2, 25. DOI: 10.3390/vision4020025
  • (2020) Gaze-Head Input: Examining Potential Interaction with Immediate Experience Sampling in an Autonomous Vehicle. Applied Sciences 10:24, 9011. DOI: 10.3390/app10249011
  • (2018) Measuring the Spatial Noise of a Low-Cost Eye Tracker to Enhance Fixation Detection. Journal of Imaging 4:8, 96. DOI: 10.3390/jimaging4080096
  • (2017) Online Recognition of Driver-Activity Based on Visual Scanpath Classification. IEEE Intelligent Transportation Systems Magazine 9:4, 23-36. DOI: 10.1109/MITS.2017.2743171
