ABSTRACT
From human fixation behavior, we can infer which regions need to be processed at high resolution and where stronger compression can be applied. Analyzing the visual scan path solely on the basis of a predefined set of regions of interest (ROIs) limits the scope of the analysis: insights can only be gained for those regions that the data analyst considered worth labeling. Furthermore, visual exploration is inherently time-dependent: a short initial overview phase may be followed by an in-depth analysis of the regions that attracted the most attention. Therefore, the shape and size of regions of interest may change over time. Automatic ROI generation can reshape the ROIs to fit the data of each time slice. We developed three novel methods for automatic ROI generation and show their applicability to different eye-tracking data sets. The methods are publicly available as part of the EyeTrace software at http://www.ti.uni-tuebingen.de/Eyetrace.175L0.html
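To make the idea of data-driven ROI generation concrete, the following is a minimal, hypothetical sketch of one common approach — single-linkage clustering of fixation points, with each cluster's bounding box taken as an ROI. It is a generic illustration of the concept, not a reimplementation of the three methods the paper describes; all function names and thresholds here are assumptions.

```python
# Hypothetical sketch: cluster fixation points that lie within a distance
# threshold of each other (union-find / single linkage), then report each
# sufficiently large cluster's bounding box as an ROI.
from math import hypot

def cluster_fixations(fixations, max_dist=50.0):
    """Group (x, y) fixation points into clusters via single linkage."""
    n = len(fixations)
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    def union(i, j):
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[rj] = ri

    # Merge every pair of fixations closer than the threshold.
    for i in range(n):
        for j in range(i + 1, n):
            (x1, y1), (x2, y2) = fixations[i], fixations[j]
            if hypot(x1 - x2, y1 - y2) <= max_dist:
                union(i, j)

    clusters = {}
    for i, p in enumerate(fixations):
        clusters.setdefault(find(i), []).append(p)
    return list(clusters.values())

def rois_from_clusters(clusters, min_points=2):
    """Axis-aligned bounding-box ROI for each cluster with enough fixations."""
    rois = []
    for pts in clusters:
        if len(pts) < min_points:
            continue  # treat isolated fixations as noise
        xs, ys = zip(*pts)
        rois.append((min(xs), min(ys), max(xs), max(ys)))
    return rois
```

Running this separately on the fixations of each time slice would yield ROIs whose shape and extent adapt over time, in the spirit of the time-dependent analysis described above.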