
The line of sight to estimate method based on stereo vision


Abstract

To obtain high-accuracy gaze direction while tolerating relatively large natural head movement, this paper proposes a three-dimensional gaze estimation method based on two cameras and two light sources. The user's point of regard is computed entirely from spatial geometry. First, the cameras are calibrated, and the calibrated cameras are then used to determine the three-dimensional positions of the display screen and the light sources. Because the center of corneal curvature and the pupil center lie on the same optical axis, the spatial direction of the optical axis is determined by computing the corneal curvature center and the virtual pupil center. Second, once the optical axis is known, the method does not estimate the Kappa angle between the optical axis and the visual axis directly; instead, it calibrates a 3 × 3 rotation matrix M that indirectly captures the relationship between the two axes, which determines the direction of the visual axis. Finally, the fixation point is obtained as the intersection of the visual axis with the screen.
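
As a rough illustration of the geometric pipeline sketched in the abstract, the snippet below covers only the last two steps under simplifying assumptions: the optical-axis direction (pointing from the corneal curvature center toward the virtual pupil center), the calibrated 3 × 3 matrix M, and the screen plane are all assumed to already be expressed in one common coordinate frame, and every function name and numeric value is illustrative rather than taken from the paper.

```python
import numpy as np

def visual_axis_direction(optic_axis_dir, M):
    """Map the unit optical-axis direction through the calibrated 3x3 matrix M
    to obtain the visual-axis direction (instead of estimating Kappa directly)."""
    v = M @ optic_axis_dir
    return v / np.linalg.norm(v)

def gaze_point_on_screen(cornea_center, visual_dir, screen_point, screen_normal):
    """Intersect the visual-axis ray (origin: corneal curvature center) with the
    screen plane given by a point on the screen and the plane's unit normal."""
    denom = screen_normal @ visual_dir
    if abs(denom) < 1e-9:          # ray is parallel to the screen plane
        return None
    t = screen_normal @ (screen_point - cornea_center) / denom
    return cornea_center + t * visual_dir

# --- Illustrative values (assumed, all in one common coordinate frame, mm) ---
cornea_center = np.array([10.0, -5.0, 550.0])   # corneal curvature center
pupil_center  = np.array([10.5, -4.6, 545.0])   # virtual pupil center
optic_dir = pupil_center - cornea_center
optic_dir /= np.linalg.norm(optic_dir)

M = np.eye(3)                                   # placeholder; calibrated per user in practice
screen_point  = np.array([0.0, 0.0, 0.0])       # a point on the screen plane
screen_normal = np.array([0.0, 0.0, 1.0])       # unit normal of the screen plane

visual_dir = visual_axis_direction(optic_dir, M)
print(gaze_point_on_screen(cornea_center, visual_dir, screen_point, screen_normal))
```

In the method described above, M would come from a per-user calibration step rather than being set to the identity as in this sketch, and the screen plane would be the one located during the system calibration stage.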




Author information

Corresponding author

Correspondence to Wang Li.


About this article


Cite this article

Changyuan, W., Li, W. & Pengxiang, X. The line of sight to estimate method based on stereo vision. Multimed Tools Appl 75, 12123–12136 (2016). https://doi.org/10.1007/s11042-016-3283-8

