Abstract
Detecting and tracking human faces in video sequences is useful in applications such as gesture recognition and human–machine interaction. In this paper, we show that online appearance models (holistic approaches) can be used to simultaneously track the head, lips, eyebrows, and eyelids in monocular video sequences. Unlike previous approaches to eyelid tracking, our method uses neither color information nor intensity edges. More precisely, we show how classical appearance-based trackers can be upgraded to deal with fast eyelid movements, and the proposed eyelid tracking is made robust by avoiding eye-feature extraction. Experiments on real videos show the usefulness of the proposed tracking schemes as well as their improvement over our previous approach.
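To make the idea of an online appearance model concrete, the sketch below shows a generic adaptive template in this spirit: each pixel of the tracked region is modeled by a mean and variance that are updated recursively, so the model adapts to gradual appearance change without relying on color or edge features. The update rule (exponential forgetting of a per-pixel Gaussian) and all names here are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np


class OnlineAppearanceModel:
    """Generic sketch of an adaptive appearance template (assumption:
    per-pixel Gaussian with exponential forgetting, not the paper's model)."""

    def __init__(self, first_patch, alpha=0.05, eps=1e-6):
        self.mu = first_patch.astype(float)   # per-pixel mean appearance
        self.var = np.ones_like(self.mu)      # per-pixel variance
        self.alpha = alpha                    # forgetting factor
        self.eps = eps                        # avoids division by zero

    def distance(self, patch):
        # Mahalanobis-like distance of a candidate patch to the model;
        # a tracker would minimize this over warp/pose parameters.
        d = (patch - self.mu) ** 2 / (self.var + self.eps)
        return d.mean()

    def update(self, patch):
        # Recursive update: the model slowly absorbs the current appearance,
        # which lets it follow gradual changes (lighting, expression).
        self.mu = (1 - self.alpha) * self.mu + self.alpha * patch
        self.var = (1 - self.alpha) * self.var + self.alpha * (patch - self.mu) ** 2
```

In a full tracker, the `distance` term would be evaluated over candidate head/facial-action parameters at each frame, and `update` applied to the patch extracted at the winning parameters.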
Acknowledgments
The authors thank Dr. Franck Davoine from CNRS, Compiègne, France, for providing the video sequence shown in Figure 10.
Cite this article
Dornaika, F., Orozco, J. Real time 3D face and facial feature tracking. J Real-Time Image Proc 2, 35–44 (2007). https://doi.org/10.1007/s11554-007-0032-2