
Reducing calibration drift in mobile eye trackers by exploiting mobile phone usage

Published: 25 June 2019

Abstract

Automatic saliency-based recalibration is a promising way to address calibration drift in mobile eye trackers, but existing bottom-up saliency methods neglect users' goal-directed visual attention in natural behaviour. By inspecting real-life recordings from egocentric eye tracker cameras, we reveal that users are likely to look at their phones once these appear in view. We propose two novel automatic recalibration methods that exploit mobile phone usage: the first builds saliency maps from the phone location in the egocentric view to identify likely gaze locations; the second uses the occurrence of touch events to recalibrate the eye tracker, thereby enabling privacy-preserving recalibration. Through in-depth evaluations on a recent mobile eye tracking dataset (N=17, 65 hours), we show that our approaches outperform a state-of-the-art saliency approach for automatic recalibration. As such, our approaches improve mobile eye tracking and gaze-based interaction, particularly for long-term use.
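To make the first method concrete, below is a minimal sketch under simplifying assumptions of our own, not the authors' implementation: a phone detector yields a bounding box in the scene-camera frame, and the "saliency map" is simply a Gaussian bump over that box whose peak serves as the likely gaze location. All names and parameter values are hypothetical.

```python
# Hedged sketch: phone-location saliency map in the egocentric view.
import numpy as np

def phone_saliency_map(frame_shape, phone_bbox, sigma_scale=0.5):
    """Gaussian saliency centred on a detected phone bounding box.

    frame_shape: (height, width) of the scene-camera frame.
    phone_bbox: (x_min, y_min, x_max, y_max) from a phone detector (assumed).
    Returns an (H, W) map normalised to [0, 1]; its argmax is the
    likely gaze location used for recalibration.
    """
    h, w = frame_shape
    x0, y0, x1, y1 = phone_bbox
    cx, cy = (x0 + x1) / 2.0, (y0 + y1) / 2.0
    # Spread proportional to box size: a distant (small) phone gets a tighter bump.
    sx = max(1.0, sigma_scale * (x1 - x0))
    sy = max(1.0, sigma_scale * (y1 - y0))
    ys, xs = np.mgrid[0:h, 0:w]
    sal = np.exp(-(((xs - cx) / sx) ** 2 + ((ys - cy) / sy) ** 2) / 2.0)
    return sal / sal.max()

# Hypothetical usage on a 720p scene frame with a detected phone box.
sal = phone_saliency_map((720, 1280), (500, 300, 700, 640))
print(np.unravel_index(sal.argmax(), sal.shape))  # peak at the bbox centre
```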
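The second, touch-based method can be sketched similarly. Assuming each touch implies the user was fixating the touched point, that the touch point has been projected into scene-camera coordinates (e.g., via the phone detection above), and that drift is roughly affine in 2D, the pairs of (drifted estimate, assumed gaze point) yield a least-squares correction. Again, this is an illustrative sketch, not the paper's implementation, and all sample values are hypothetical.

```python
# Hedged sketch: recalibration from touch events via an affine correction.
import numpy as np

def fit_affine_correction(est_gaze, true_gaze):
    """Least-squares affine map from drifted gaze estimates to assumed gaze.

    est_gaze, true_gaze: (N, 2) arrays of 2D points in scene-camera pixels.
    Returns a (3, 2) matrix A such that [x, y, 1] @ A approximates true gaze.
    Needs at least 3 non-collinear point pairs.
    """
    X = np.hstack([est_gaze, np.ones((len(est_gaze), 1))])  # homogeneous coords
    A, *_ = np.linalg.lstsq(X, true_gaze, rcond=None)
    return A

def apply_correction(A, gaze):
    """Apply the fitted correction to one or more gaze estimates."""
    gaze = np.atleast_2d(gaze)
    X = np.hstack([gaze, np.ones((len(gaze), 1))])
    return X @ A

# Hypothetical usage: drifted tracker output at four touch timestamps,
# paired with the touched points projected into the egocentric view.
est = np.array([[412.0, 310.0], [650.0, 402.0], [508.0, 355.0], [590.0, 290.0]])
tru = np.array([[430.0, 325.0], [668.0, 415.0], [525.0, 370.0], [607.0, 303.0]])
A = fit_affine_correction(est, tru)
print(apply_correction(A, [500.0, 340.0]))  # corrected gaze estimate
```

In a live system one would accumulate such pairs over a sliding window and fit robustly (e.g., with RANSAC) to discard touches made while the user was not actually looking at the phone.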

      Published In

      ETRA '19: Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications
      June 2019
      623 pages
      ISBN:9781450367097
      DOI:10.1145/3314111

      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Author Tags

      1. eye tracker recalibration
      2. mobile eye tracking

      Qualifiers

      • Research-article

      Funding Sources

      • JST CREST

      Conference

      ETRA '19

      Acceptance Rates

      Overall Acceptance Rate 69 of 137 submissions, 50%

      Cited By

      • (2024) CalibRead: Unobtrusive Eye Tracking Calibration from Natural Reading Behavior. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 8, 4, 1-30. DOI: 10.1145/3699737. Online publication date: 21-Nov-2024.
      • (2023) Augmented Reality User's Experience: AI-Based Data Collection, Processing and Analysis. In Augmented Reality and Artificial Intelligence, 31-46. DOI: 10.1007/978-3-031-27166-3_2. Online publication date: 30-Apr-2023.
      • (2022) Robust Unsupervised Gaze Calibration Using Conversation and Manipulation Attention Priors. ACM Transactions on Multimedia Computing, Communications, and Applications 18, 1, 1-27. DOI: 10.1145/3472622. Online publication date: 27-Jan-2022.
      • (2021) Detecting Soldiers' Fatigue Using Eye-Tracking Glasses: Practical Field Applications and Research Opportunities. Military Medicine 187, 11-12, e1330-e1337. DOI: 10.1093/milmed/usab509. Online publication date: 14-Dec-2021.
      • (2020) Ultracompact eye and pupil tracking device using VCSEL arrays and position sensitive detector. In Optical Architectures for Displays and Sensing in Augmented, Virtual, and Mixed Reality (AR, VR, MR), 50. DOI: 10.1117/12.2542593. Online publication date: 13-Mar-2020.
