Abstract
This paper presents a facial animation system based on real-time tracking of 3D facial motion from a depth camera. We first perform 2D facial motion tracking with extended Active Shape Models (ASMs) on the 2D texture image that corresponds to the captured depth data. The tracked 2D feature points are then combined with the depth information to estimate 3D facial motion. From the estimated 3D motion, we extract MPEG-4 facial animation parameters (FAPs); estimating FAPs from 3D tracking yields results that are more accurate and invariant to view variations. The resulting FAPs can drive any facial animation tool that supports FAP-based animation.
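The 2D-to-3D lifting step described above can be sketched as standard pinhole back-projection: each ASM-tracked image point (u, v) is mapped to camera coordinates using the aligned depth value at that pixel. The sketch below is illustrative only; the camera intrinsics (FX, FY, CX, CY) are hypothetical placeholder values, not parameters from the paper.

```python
import numpy as np

# Hypothetical camera intrinsics (focal lengths and principal point,
# in pixels); a real system would use the depth camera's calibration.
FX, FY = 525.0, 525.0
CX, CY = 320.0, 240.0

def backproject(points_2d, depth_map):
    """Lift ASM-tracked 2D feature points (u, v) to 3D camera
    coordinates (X, Y, Z) using the aligned depth image."""
    points_3d = []
    for u, v in points_2d:
        # Depth value at the tracked pixel (assumed in metres).
        z = depth_map[int(round(v)), int(round(u))]
        # Pinhole back-projection.
        x = (u - CX) * z / FX
        y = (v - CY) * z / FY
        points_3d.append((x, y, z))
    return np.array(points_3d)

# Toy usage: a flat depth plane 1 m from the camera.
depth = np.ones((480, 640))
pts3d = backproject([(320.0, 240.0), (330.0, 240.0)], depth)
```

Once feature points are lifted to 3D per frame, their displacements relative to the neutral face can be normalized into FAP values; the exact FAP mapping follows the MPEG-4 facial animation standard rather than this sketch.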
Copyright information
© 2011 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Lee, CS., Chun, S., Lee, SH. (2011). Facial Animation and Analysis Using 2D+3D Facial Motion Tracking. In: Kim, Th., et al. Multimedia, Computer Graphics and Broadcasting. MulGraB 2011. Communications in Computer and Information Science, vol 262. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-27204-2_33
DOI: https://doi.org/10.1007/978-3-642-27204-2_33
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-27203-5
Online ISBN: 978-3-642-27204-2