ABSTRACT
In this paper we showcase several eye movement data visualizations and how they can be interactively linked to form a flexible visualization tool for eye movement data. The aim of this project is to create a user-friendly, easily accessible tool for interpreting visual attention patterns and facilitating the analysis of eye movement data. To increase accessibility and usability, we provide a web-based solution. Users can upload their own eye movement data sets and inspect them from several perspectives simultaneously. Insights can be shared and discussed collaboratively with others. The currently available visualization techniques are a 2D density plot, a scanpath representation, a bee swarm, and a scarf plot, all supporting several standard interaction techniques. Moreover, thanks to the linking feature, users can select data in one visualization and the same data points will be highlighted in all active visualizations, which supports comparison tasks. The tool can handle both private and public data sets and can generate URLs to share the data and settings of customized visualizations. A user study showed that the tool is understandable and that providing linked, customizable views is beneficial for analyzing eye movement data.
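The linking feature described above — a selection made in one visualization is mirrored in all active visualizations — is commonly implemented with a shared selection model that every view subscribes to. The sketch below is illustrative only, not the paper's implementation; the names `SelectionModel` and `LinkedView` are hypothetical, and the data-point IDs stand in for fixation records.

```typescript
// Hypothetical sketch of linked brushing across coordinated views:
// all views share one selection model; brushing in any view updates
// the model, which notifies every registered view to re-highlight
// the same data-point IDs.

type Listener = (selected: Set<number>) => void;

class SelectionModel {
  private selected = new Set<number>();
  private listeners: Listener[] = [];

  subscribe(listener: Listener): void {
    this.listeners.push(listener);
  }

  // Called by whichever view the user brushed in.
  select(ids: number[]): void {
    this.selected = new Set(ids);
    for (const l of this.listeners) l(this.selected);
  }
}

// A minimal "view" that records which points it would highlight.
class LinkedView {
  highlighted: number[] = [];
  constructor(model: SelectionModel) {
    model.subscribe((sel) => {
      this.highlighted = [...sel].sort((a, b) => a - b);
    });
  }
}

const model = new SelectionModel();
const scanpath = new LinkedView(model);
const scarfPlot = new LinkedView(model);

// Brushing fixation points 7 and 3 in the scanpath view
// highlights the same points in every linked view.
model.select([7, 3]);
```

In a real web tool, each `LinkedView` would additionally redraw its marks (e.g., via D3) in the subscription callback; decoupling views through the model is what keeps adding a new visualization cheap.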
The Power of Linked Eye Movement Data Visualizations