Short Paper · DOI: 10.1145/3450341.3457989 · ETRA Conference Proceedings

Eye Tracking Calibration on Mobile Devices

Published: 25 May 2021

Abstract

Eye tracking is widely used in psychology, human-computer interaction, and many other fields. Recently, eye tracking based on off-the-shelf cameras has produced promising results compared with traditional, dedicated eye-tracking devices, presenting an opportunity to bring eye tracking to mobile devices. However, eye tracking on mobile devices faces many challenges, including occlusion of the face and an unstable, changing distance between the face and the camera. This research project aims to achieve stable and accurate calibration of front-camera-based eye tracking in dynamic contexts through the construction of real-world eye-movement datasets, the introduction of novel context-awareness models, and improved gaze estimation methods that can adapt to partially visible faces.
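To make the notion of calibration concrete: a common generic approach (not the method proposed in this paper) is to show the user a few known on-screen targets, collect the tracker's raw gaze estimates at each, and fit a per-user correction mapping raw estimates to screen coordinates. The sketch below fits a simple affine correction by least squares; all names and the simulated error model are illustrative assumptions.

```python
# Illustrative per-user calibration: fit an affine map from raw gaze
# estimates (x, y) to known on-screen target points via least squares.
# A generic sketch only -- not the calibration method of this paper.
import numpy as np

def fit_affine_calibration(raw_gaze, targets):
    """raw_gaze, targets: (N, 2) arrays. Returns a (3, 2) matrix A such
    that [x, y, 1] @ A approximates the corresponding target point."""
    ones = np.ones((raw_gaze.shape[0], 1))
    X = np.hstack([raw_gaze, ones])            # homogeneous inputs, (N, 3)
    A, *_ = np.linalg.lstsq(X, targets, rcond=None)
    return A

def apply_calibration(A, raw_gaze):
    """Map raw gaze estimates through the fitted affine correction."""
    ones = np.ones((raw_gaze.shape[0], 1))
    return np.hstack([raw_gaze, ones]) @ A

# Example: five calibration targets and a tracker whose raw output has a
# simulated systematic scale-and-shift error.
targets = np.array([[0, 0], [100, 0], [0, 100], [100, 100], [50, 50]], float)
raw = targets * 0.9 + np.array([12.0, -7.0])
A = fit_affine_calibration(raw, targets)
corrected = apply_calibration(A, raw)          # close to `targets`
```

Because the simulated error here is itself affine, the fit recovers it exactly; real front-camera gaze error is nonlinear and context-dependent, which is precisely why the paper argues for context-aware models rather than a fixed one-off calibration.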


Cited By

  • (2023) Development of an Appearance-based Eye Tracking System with Convolutional Neural Network Integrated in a Learning Management Application. 2023 IEEE 15th International Conference on Humanoid, Nanotechnology, Information Technology, Communication and Control, Environment, and Management (HNICEM), 1–6. https://doi.org/10.1109/HNICEM60674.2023.10589191 (19 Nov 2023)
  • (2023) Touchless and nonverbal human-robot interfaces: An overview of the state-of-the-art. Smart Health 27, 100365. https://doi.org/10.1016/j.smhl.2022.100365 (Mar 2023)
    Published In

    ETRA '21 Adjunct: ACM Symposium on Eye Tracking Research and Applications
    May 2021
    78 pages
    ISBN:9781450383578
    DOI:10.1145/3450341

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. context awareness
    2. eye tracking
    3. gaze estimation

    Qualifiers

    • Short-paper
    • Research
    • Refereed limited

    Conference

    ETRA '21

    Acceptance Rates

    Overall Acceptance Rate 69 of 137 submissions, 50%

