A dataset for point of gaze detection using head poses and eye images

  • Original Paper
  • Journal on Multimodal User Interfaces

Abstract

This paper presents a new, publicly available dataset intended as a benchmark for Point of Gaze (PoG) detection algorithms. The dataset consists of two modalities that can be combined for PoG estimation: (a) a set of videos recording the eye motion of human participants as they looked at, or followed, a set of predefined points of interest on a computer visual display unit, and (b) a sequence of 3D head poses synchronized with the videos. Eye motion was recorded with a head-mounted Mobile Eye-XG infrared monocular camera, and head position with a set of Vicon motion-capture cameras. Ground truth for the point of gaze and for the head location and orientation in three-dimensional space is provided together with the data. The point-of-gaze ground truth is known in advance, since the participants are always looking at predefined targets on a monitor.
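As a rough illustration of how the two modalities could be combined, the sketch below aligns each eye-video frame with the nearest motion-capture head-pose record by timestamp. The file layout, field names, and sampling rates here are hypothetical (the video and mocap rates below are placeholders, not the dataset's actual rates); consult the dataset documentation for the real format.

```python
from bisect import bisect_left

def nearest_pose(frame_t, pose_times):
    """Index of the head-pose sample whose timestamp is closest to frame_t."""
    i = bisect_left(pose_times, frame_t)
    if i == 0:
        return 0
    if i == len(pose_times):
        return len(pose_times) - 1
    # Pick whichever neighbor is closer in time.
    return i if pose_times[i] - frame_t < frame_t - pose_times[i - 1] else i - 1

def synchronize(frame_times, poses):
    """Pair each video frame time with the nearest head-pose record.

    poses: list of (timestamp, pose) tuples sorted by timestamp.
    Returns a list of (frame_time, pose) pairs, one per frame.
    """
    pose_times = [t for t, _ in poses]
    return [(t, poses[nearest_pose(t, pose_times)][1]) for t in frame_times]

# Synthetic example: 30 fps video frames vs. 100 Hz mocap records.
frame_times = [i / 30.0 for i in range(4)]
poses = [(i / 100.0, {"x": i}) for i in range(12)]  # hypothetical pose payloads
pairs = synchronize(frame_times, poses)
```

Nearest-neighbor matching is the simplest alignment strategy; interpolating between the two surrounding pose samples would give smoother head trajectories at the cost of a little extra code.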



Notes

  1. http://www.vicon.com/.

  2. http://www.asleyetracking.com/.

  3. The dataset can be downloaded from: heracleia.uta.edu/eyetracking.


Acknowledgments

This work is supported in part by the National Science Foundation under award numbers CNS 0923494 and CNS 1035913. Any opinions, findings, and conclusions or recommendations expressed in this publication are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.

Author information

Corresponding author

Correspondence to Ilias Maglogiannis.



About this article

Cite this article

McMurrough, C.D., Metsis, V., Kosmopoulos, D. et al. A dataset for point of gaze detection using head poses and eye images. J Multimodal User Interfaces 7, 207–215 (2013). https://doi.org/10.1007/s12193-013-0121-4
