Abstract
This article presents a feature-based framework that automatically tracks 18 facial landmarks for emotion recognition and the analysis of emotional dynamics. Using multi-kernel learning in a novel way, we combine two methods: the first matches facial feature points between consecutive images, and the second relies on offline learning of facial landmark appearance. Point matching yields jitter-free tracking, while the offline learning prevents the tracking framework from drifting. We train the tracking system on the Cohn-Kanade database and analyze the dynamics of emotions and Action Units on MMI database sequences. We perform accurate detection of the temporal segments of facial expressions and report experimental results.
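The two cues described in the abstract can be sketched as a weighted combination of kernels, one comparing a candidate landmark patch to the patch tracked in the previous frame and one comparing it to an offline-learned appearance template. This is a minimal illustrative sketch, not the authors' implementation: the RBF kernel, the descriptors, and the fixed weights are assumptions here, whereas in actual multi-kernel learning the weights would be learned jointly with the classifier.

```python
import numpy as np

def rbf_kernel(x, y, gamma=0.5):
    """Gaussian (RBF) kernel between two patch descriptors (illustrative choice)."""
    return np.exp(-gamma * np.sum((np.asarray(x) - np.asarray(y)) ** 2))

def combined_score(candidate, prev_frame_patch, appearance_template,
                   betas=(0.6, 0.4)):
    """Convex combination of two kernels, in the spirit of multi-kernel learning:
    one matches the candidate against the previous frame (temporal matching,
    which keeps tracking jitter-free), the other against an offline-learned
    appearance template (which prevents drift). The weights are hypothetical."""
    k_match = rbf_kernel(candidate, prev_frame_patch)
    k_appearance = rbf_kernel(candidate, appearance_template)
    return betas[0] * k_match + betas[1] * k_appearance

def best_candidate(candidates, prev_frame_patch, appearance_template):
    """Pick the index of the candidate position with the highest combined score."""
    scores = [combined_score(c, prev_frame_patch, appearance_template)
              for c in candidates]
    return int(np.argmax(scores))
```

Because both cues enter through the same score, a candidate that matches only the previous frame (risking drift) or only the static template (risking jitter) is penalized relative to one consistent with both.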
Copyright information
© 2011 Springer-Verlag Berlin Heidelberg
Cite this paper
Senechal, T., Rapp, V., Prevost, L. (2011). Facial Feature Tracking for Emotional Dynamic Analysis. In: Blanc-Talon, J., Kleihorst, R., Philips, W., Popescu, D., Scheunders, P. (eds) Advanced Concepts for Intelligent Vision Systems. ACIVS 2011. Lecture Notes in Computer Science, vol 6915. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-23687-7_45
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-23686-0
Online ISBN: 978-3-642-23687-7