ETRA '21 Short Papers · Short paper · Open access
DOI: 10.1145/3448018.3457997

Visualizing Prediction Correctness of Eye Tracking Classifiers

Published: 25 May 2021

Abstract

Eye tracking data is often used to train machine learning algorithms for classification tasks. The main indicator of performance for such classifiers is typically their prediction accuracy. However, this number reveals nothing about the inner workings of the classifier. In this paper we introduce novel visualization methods that provide such information. We introduce the Prediction Correctness Value (PCV): the difference between the calculated probability for the correct class and the maximum calculated probability for any other class. Based on the PCV we present two visualizations: (1) coloring segments of eye tracking trajectories according to their PCV, thus indicating how beneficial certain parts are towards correct classification, and (2) overlaying this information for all participants to produce a heatmap that indicates at which places fixations are particularly beneficial towards correct classification. Using these new visualizations we compare the performance of two classifiers, Random Forest (RF) and Radial Basis Function Network (RBFN).
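As a minimal sketch (not the authors' implementation), the PCV defined above can be computed per sample from a classifier's per-class probability estimates, e.g. the rows returned by scikit-learn's `predict_proba`. The function name and array layout here are illustrative assumptions:

```python
import numpy as np

def prediction_correctness_value(proba, true_class):
    """PCV for one sample: probability assigned to the true class
    minus the highest probability assigned to any other class.
    Positive PCV means the classifier would predict correctly;
    the value lies in [-1, 1]."""
    p_true = proba[true_class]
    p_best_other = np.delete(proba, true_class).max()
    return p_true - p_best_other

# Example: three-class probability vector, true class = 0
pcv = prediction_correctness_value(np.array([0.5, 0.3, 0.2]), 0)
# pcv is positive (0.5 - 0.3): the true class is ranked highest
```

Segments of a trajectory (or heatmap cells) could then be colored by mapping each PCV onto a diverging color scale, with negative values marking parts that mislead the classifier.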


Published In

ETRA '21 Short Papers: ACM Symposium on Eye Tracking Research and Applications
May 2021, 232 pages
ISBN: 9781450383455
DOI: 10.1145/3448018
Publisher

Association for Computing Machinery, New York, NY, United States


        Author Tags

        1. Explainable Artificial Intelligence
        2. Eye Movement Biometrics
        3. Eye Tracking
        4. Gaze Point Visualization
        5. Machine Learning
        6. Prediction Visualization
7. User Identification


Conference

ETRA '21
Overall acceptance rate: 69 of 137 submissions, 50%
