ABSTRACT
The performance of appearance-based gaze estimation systems that use machine learning depends on the training dataset. Most existing gaze estimation datasets were recorded in laboratory conditions, and those recorded in the wild exhibit limited head pose and intra-person variation. We propose PARKS-Gaze, a gaze estimation dataset with 570 minutes of video data from 18 participants. The dataset covers a head pose range of ±50° in yaw and [-40°, 60°] in pitch. We captured multiple images for each Point of Gaze (PoG), enabling precision analysis of gaze estimation models. Our cross-dataset experiments show that a model trained on the proposed dataset obtains lower mean test errors than models trained on existing datasets, indicating its utility for developing real-world interactive gaze-controlled applications.
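Because the dataset provides multiple frames per fixated target, precision (stability of repeated predictions) can be reported alongside the usual accuracy metric. The sketch below illustrates one common way to compute both; it is a minimal illustration only, assuming predictions and targets are 2D on-screen PoG coordinates in millimetres and that frames are grouped per target, and the function names are illustrative rather than taken from the paper's released code.

```python
# Minimal sketch: accuracy vs. precision when several frames are captured
# for a single Point of Gaze (PoG). Assumption (not from the paper):
# predictions/targets are 2D screen coordinates in millimetres.
import numpy as np

def accuracy_mm(preds: np.ndarray, target: np.ndarray) -> float:
    """Mean Euclidean distance between predictions and the true PoG."""
    return float(np.mean(np.linalg.norm(preds - target, axis=1)))

def precision_mm(preds: np.ndarray) -> float:
    """RMS spread of predictions around their own mean for one PoG;
    low values indicate a stable estimator across repeated frames."""
    centered = preds - preds.mean(axis=0, keepdims=True)
    return float(np.sqrt(np.mean(np.sum(centered ** 2, axis=1))))

# Synthetic example: 10 frames recorded while fixating one target.
target = np.array([120.0, 80.0])                            # true PoG (mm)
preds = target + np.random.normal(0.0, 3.0, size=(10, 2))   # model outputs (mm)
print(f"accuracy : {accuracy_mm(preds, target):.2f} mm")
print(f"precision: {precision_mm(preds):.2f} mm")
```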