Abstract
In recent years, research on human-computer interaction has become increasingly popular, much of it relying on body movements, gestures, or eye gaze direction. Gaze estimation remains an active research domain. We propose an efficient method for estimating the eye gaze point. We first locate the eye region by modifying the feature set of the Active Appearance Model (AAM), replacing the original 68 facial feature points with 36 eye feature points that describe the eye contour, iris size, iris location, and pupil position. Then, using the two-dimensional coordinates of these feature points, we employ a Support Vector Machine (SVM) to classify the gaze into five directions. In addition, camera resolution does not affect the accuracy with which our method determines the direction of the line of sight. The final results show independent classifications, fewer classification errors, and more accurate estimation of the gaze directions.
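The pipeline described above, flattening the 2D coordinates of the 36 eye feature points into a vector and classifying them into five gaze directions with an SVM, can be sketched as follows. This is only an illustrative sketch using synthetic data: the point layout, the direction labels, and the assumption that the last eight points track the pupil are hypothetical stand-ins, not the paper's actual features or training set.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
DIRECTIONS = ["center", "left", "right", "up", "down"]  # hypothetical label set

def synthetic_sample(direction_idx):
    """Fake a 36-point eye shape: noisy contour points plus a
    direction-dependent offset on the points we pretend track the pupil."""
    points = rng.normal(0.0, 0.05, size=(36, 2))
    offset = np.array([[0, 0], [-1, 0], [1, 0], [0, -1], [0, 1]])[direction_idx]
    points[28:36] += 0.5 * offset  # assumed pupil-related points shift with gaze
    return points.ravel()          # 36 (x, y) pairs -> 72-dim feature vector

# Build a small labeled set and train a multi-class SVM on the flattened coordinates.
X = np.array([synthetic_sample(i % 5) for i in range(200)])
y = np.array([i % 5 for i in range(200)])
clf = SVC(kernel="rbf", C=10.0).fit(X, y)

# Classify a new sample into one of the five gaze directions.
pred = int(clf.predict(synthetic_sample(1).reshape(1, -1))[0])
print(DIRECTIONS[pred])
```

Because the SVM operates only on relative 2D point coordinates, the same classification step works regardless of the camera resolution used to capture the eye region, consistent with the claim in the abstract.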
Acknowledgment
This work was partially supported by the National Science Council, Taiwan, under Grants No. NSC101-2221-E-011-141, NSC100-2221-E-011-121, and NSC101-2221-E-211-011.
Cite this article
Wu, YL., Yeh, CT., Hung, WC. et al. Gaze direction estimation using support vector machine with active appearance model. Multimed Tools Appl 70, 2037–2062 (2014). https://doi.org/10.1007/s11042-012-1220-z