DOI: 10.1145/3204493.3204528 · ETRA Conference Proceedings · Research article

Capturing real-world gaze behaviour: live and unplugged

Published: 14 June 2018

Abstract

Understanding human gaze behaviour offers benefits ranging from basic scientific insight to many application domains. Current practice constrains possible use cases, restricting experimentation to a lab setting or other controlled environment. In this paper, we demonstrate a flexible, unconstrained, end-to-end solution that allows gaze data to be collected and analysed in real-world settings. To achieve these objectives, rich 3D models of the real world are derived, along with strategies for associating experimental eye-tracking data with these models. In particular, we show the strength of photogrammetry in making these capabilities possible, and present the first complete solution for 3D gaze analysis in large-scale outdoor environments using standard camera technology without fiducial markers. The paper also presents techniques for quantitative analysis and visualization of 3D gaze data. As a whole, the body of techniques presented provides a foundation for future research, opening new opportunities for experimental studies and computational modeling efforts.
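A core step the abstract describes is associating eye-tracking data with a reconstructed 3D model. One common way to realise this (an illustrative sketch, not the paper's actual implementation) is to express each gaze sample as a ray from the tracked camera position and intersect it with the scene mesh, e.g. via Möller–Trumbore ray-triangle intersection; the nearest hit is the 3D point of regard. All names below are hypothetical.

```python
# Illustrative sketch: intersecting a gaze ray with a triangle mesh to
# recover a 3D point of regard (Moller-Trumbore ray-triangle test).
# This is an assumed pipeline step, not code from the paper.

EPS = 1e-9

def sub(a, b): return (a[0]-b[0], a[1]-b[1], a[2]-b[2])
def dot(a, b): return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]
def cross(a, b):
    return (a[1]*b[2]-a[2]*b[1], a[2]*b[0]-a[0]*b[2], a[0]*b[1]-a[1]*b[0])

def ray_triangle(origin, direction, v0, v1, v2):
    """Distance t along the ray to the triangle, or None if no hit."""
    e1, e2 = sub(v1, v0), sub(v2, v0)
    p = cross(direction, e2)
    det = dot(e1, p)
    if abs(det) < EPS:            # ray parallel to the triangle plane
        return None
    inv = 1.0 / det
    s = sub(origin, v0)
    u = dot(s, p) * inv           # first barycentric coordinate
    if u < 0.0 or u > 1.0:
        return None
    q = cross(s, e1)
    v = dot(direction, q) * inv   # second barycentric coordinate
    if v < 0.0 or u + v > 1.0:
        return None
    t = dot(e2, q) * inv
    return t if t > EPS else None # hits behind the eye are rejected

def point_of_regard(origin, direction, mesh):
    """Nearest intersection of the gaze ray with a list of triangles."""
    best = None
    for tri in mesh:
        t = ray_triangle(origin, direction, *tri)
        if t is not None and (best is None or t < best):
            best = t
    if best is None:
        return None
    return tuple(o + best * d for o, d in zip(origin, direction))

# Example: a gaze ray along +z hitting a square wall (two triangles) at z = 2.
wall = [((-1,-1,2), (1,-1,2), (1,1,2)), ((-1,-1,2), (1,1,2), (-1,1,2))]
print(point_of_regard((0, 0, 0), (0, 0, 1), wall))  # -> (0.0, 0.0, 2.0)
```

Aggregating such hit points over a recording, binned per mesh face or splatted as a surface density, gives the kind of quantitative analysis and 3D heat-map visualization the abstract refers to.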




Published In

ETRA '18: Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications
June 2018
595 pages
ISBN:9781450357067
DOI:10.1145/3204493
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. 3D modeling
  2. eye tracking
  3. gaze analysis
  4. gaze visualization

Qualifiers

  • Research-article

Conference

ETRA '18

Acceptance Rates

Overall Acceptance Rate 69 of 137 submissions, 50%


Cited By

  • (2025) The fundamentals of eye tracking part 4: Tools for conducting an eye tracking study. Behavior Research Methods 57:1. DOI: 10.3758/s13428-024-02529-7. Online publication date: 6-Jan-2025.
  • (2023) Object-based inhibition of return in three-dimensional space: From simple drawings to real objects. Journal of Vision 23:13(7). DOI: 10.1167/jov.23.13.7. Online publication date: 16-Nov-2023.
  • (2022) A Systematic Review of Visualization Techniques and Analysis Tools for Eye-Tracking in 3D Environments. Frontiers in Neuroergonomics 3. DOI: 10.3389/fnrgo.2022.910019. Online publication date: 13-Jul-2022.
  • (2019) Indoor human localization based on the corneal reflection of illumination. Proceedings of the 21st International Conference on Human-Computer Interaction with Mobile Devices and Services, 1-6. DOI: 10.1145/3338286.3344388. Online publication date: 1-Oct-2019.
  • (2019) Semantic gaze labeling for human-robot shared manipulation. Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications, 1-9. DOI: 10.1145/3314111.3319840. Online publication date: 25-Jun-2019.
