DOI: 10.1145/2578153.2578158

ISeeCube: visual analysis of gaze data for video

Published: 26 March 2014

Abstract

We introduce a new design for the visual analysis of eye tracking data recorded from dynamic stimuli such as video. ISeeCube includes multiple coordinated views that support different analysis tasks. It combines methods for the spatiotemporal analysis of gaze data recorded from unlabeled videos with the ability to annotate and investigate dynamic Areas of Interest (AOIs). A space-time cube visualization provides a static overview of the complete data set, showing gaze points with density-based color mapping and spatiotemporal clustering. A timeline visualization supports the analysis of dynamic AOIs and the viewers' attention on them. AOI-based scanpaths of different viewers can be clustered by their Levenshtein distance, an attention map, or the transitions between AOIs. With these visual analytics techniques, the exploration of eye tracking data recorded from several viewers is supported for a wide range of analysis tasks.
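The Levenshtein distance mentioned above treats each AOI-based scanpath as a sequence of AOI labels and counts the minimum number of insertions, deletions, and substitutions needed to turn one sequence into another; pairwise distances can then feed a clustering step. A minimal sketch (the AOI labels and scanpaths below are hypothetical examples, not data from the paper):

```python
def levenshtein(a, b):
    """Minimum number of insertions, deletions, and substitutions
    turning sequence a into sequence b."""
    prev = list(range(len(b) + 1))               # distances from empty prefix of a
    for i, x in enumerate(a, start=1):
        curr = [i]                               # distance from a[:i] to empty prefix of b
        for j, y in enumerate(b, start=1):
            cost = 0 if x == y else 1
            curr.append(min(prev[j] + 1,         # delete x
                            curr[j - 1] + 1,     # insert y
                            prev[j - 1] + cost)) # substitute x -> y
        prev = curr
    return prev[-1]

# Hypothetical scanpaths: each letter is a fixated AOI.
s1 = list("AABCCB")
s2 = list("ABCB")
print(levenshtein(s1, s2))  # 2: one 'A' and one 'C' deleted
```

A pairwise distance matrix over all viewers' scanpaths, built with this function, is a standard input to hierarchical clustering; the paper's other clustering criteria (attention maps, AOI transitions) would replace the distance function while keeping the same overall pipeline.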

Published In

ETRA '14: Proceedings of the Symposium on Eye Tracking Research and Applications
March 2014
394 pages
ISBN:9781450327510
DOI:10.1145/2578153


Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. dynamic areas of interest
  2. eye tracking
  3. space-time cube
  4. visual analytics

Qualifiers

  • Research-article


Conference

ETRA '14
ETRA '14: Eye Tracking Research and Applications
March 26 - 28, 2014
Safety Harbor, Florida

Acceptance Rates

Overall Acceptance Rate 69 of 137 submissions, 50%


