DOI: 10.1145/3490100.3516467
Work in Progress

PARKS-Gaze - A Precision-focused Gaze Estimation Dataset in the Wild under Extreme Head Poses

Published: 22 March 2022

ABSTRACT

The performance of appearance-based gaze estimation systems that rely on machine learning depends on their training datasets. Most existing gaze estimation datasets were recorded under laboratory conditions, and those recorded in the wild exhibit limited head pose and intra-person variation. We propose PARKS-Gaze, a gaze estimation dataset with 570 minutes of video data from 18 participants. The dataset covers a head pose range of ±50 degrees in yaw and [-40, 60] degrees in pitch. We captured multiple images for each Point of Gaze (PoG), enabling precision analysis of gaze estimation models. Our cross-dataset experiments revealed that a model trained on the proposed dataset obtained lower mean test errors than models trained on existing datasets, indicating its utility for developing real-world interactive gaze-controlled applications.
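The abstract distinguishes precision (the spread of repeated estimates for one fixated point, which the multiple images per PoG make measurable) from the mean test error (accuracy) reported in the cross-dataset experiments. The paper's exact metric definitions are not given here; the following is a minimal sketch of one common convention, where accuracy is the mean distance to the ground-truth PoG and precision is the RMS deviation of the estimates from their own centroid. All names and the sample values are illustrative, not from the paper.

```python
import math

def accuracy_and_precision(estimates, target):
    """Accuracy: mean Euclidean distance from each estimate to the true PoG.
    Precision: RMS deviation of the estimates from their own centroid."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    n = len(estimates)
    accuracy = sum(dist(e, target) for e in estimates) / n
    # Centroid of the estimates: precision ignores the ground truth entirely.
    cx = sum(e[0] for e in estimates) / n
    cy = sum(e[1] for e in estimates) / n
    precision = math.sqrt(sum(dist(e, (cx, cy)) ** 2 for e in estimates) / n)
    return accuracy, precision

# Hypothetical example: five gaze estimates (in degrees of visual angle)
# for a single fixated point at the origin.
est = [(1.0, 0.2), (1.2, 0.1), (0.9, 0.3), (1.1, 0.2), (1.0, 0.2)]
acc, prec = accuracy_and_precision(est, (0.0, 0.0))
```

In this example the estimates cluster tightly but are biased away from the target, so precision is small while accuracy (mean error) is large; capturing several frames per PoG is what makes the two quantities separable.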


Published in

IUI '22 Companion: Companion Proceedings of the 27th International Conference on Intelligent User Interfaces
March 2022, 142 pages
ISBN: 9781450391450
DOI: 10.1145/3490100

          Copyright © 2022 Owner/Author

          Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

Publisher: Association for Computing Machinery, New York, NY, United States


          Qualifiers

          • Work in Progress
          • Research
          • Refereed limited

          Acceptance Rates

Overall acceptance rate: 746 of 2,811 submissions, 27%
