Abstract
This work describes an approach to synthesizing facial expressions, including intermediate ones, using the tools provided by the MPEG-4 standard, based on real measurements and on widely accepted assumptions about their meaning that take into account the results of Whissel's study. In addition, MPEG-4 facial animation parameters (FAPs) are used to evaluate theoretical predictions for intermediate expressions of a given emotion episode, based on Scherer's appraisal theory. MPEG-4 FAPs and action units are combined to model the effects of appraisal checks on facial expressions, and the temporal evolution of facial expressions is investigated. The results of the synthesis process can then be applied to Embodied Conversational Agents (ECAs), making their interaction with humans, or with other ECAs, more affective.
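The idea of producing intermediate expressions from archetypal ones can be sketched as a blend of FAP vectors. The snippet below is a minimal illustration only: the FAP names and target values are hypothetical placeholders, not figures from the paper, and real MPEG-4 synthesis would normalize displacements by FAP units (FAPUs) per face model.

```python
# Hedged sketch: an intermediate expression as a linear blend between a
# neutral FAP vector and an archetypal-expression FAP vector.
# FAP names and magnitudes below are illustrative, not from the paper.

NEUTRAL = {"raise_l_i_eyebrow": 0.0, "stretch_l_cornerlip": 0.0}
JOY = {"raise_l_i_eyebrow": 120.0, "stretch_l_cornerlip": 400.0}

def intermediate_expression(neutral, target, intensity):
    """Blend two FAP vectors; intensity in [0, 1] scales toward target."""
    if not 0.0 <= intensity <= 1.0:
        raise ValueError("intensity must lie in [0, 1]")
    return {fap: neutral[fap] + intensity * (target[fap] - neutral[fap])
            for fap in target}

# A half-intensity expression of joy:
half_joy = intermediate_expression(NEUTRAL, JOY, 0.5)
print(half_joy)  # {'raise_l_i_eyebrow': 60.0, 'stretch_l_cornerlip': 200.0}
```

In the paper's setting, the blend weight would be driven by the activation dimension of Whissell's dictionary or by the outcome of successive appraisal checks, rather than chosen by hand.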
References
Picard RW (1997) Affective computing. MIT Press, Cambridge
Wehrle T, Kaiser S, Schmidt S, Scherer KR (2000) Studying the dynamics of emotional expression using synthesized facial muscle movements. J Pers Soc Psychol 78(1):105–119
Roseman IJ, Smith CA (2001) Appraisal theory: overview, assumptions, varieties, controversies. In: Scherer KR, Schorr A, Johnstone T (eds) Appraisal processes in emotion: theory methods, research. Oxford University Press, Oxford, pp 3–19
Ortony A, Clore GL, Collins A (1988) The cognitive structure of emotions. Cambridge University Press, Cambridge
Pegna AJ, Khateb A, Lazeyras F, Seghier ML (2004) Discriminating emotional faces without primary visual cortices involves the right amygdala. Nat Neurosci 8(1):24–25
Scherer KR (2001) Appraisal considered as a process of multilevel sequential checking. In: Scherer KR, Schorr A, Johnstone T (eds) Appraisal processes in emotion: theory methods, research. Oxford University Press, Oxford, pp 92–129
Scherer KR (1984) On the nature and function of emotion: a component process approach. In: Scherer KR, Ekman P (eds) Approaches to emotion. Lawrence Erlbaum Associates, Hillsdale, pp 293–318
Tekalp M, Ostermann J (2000) Face and 2-D mesh animation in MPEG-4. Image Commun J 15(4–5):387–421
Ekman P (1993) Facial expression and emotion. Am Psychol 48:384–392
Raouzaiou A, Tsapatsoulis N, Karpouzis K, Kollias S (2002) Parameterized facial expression synthesis based on MPEG-4. EURASIP J Appl Signal Process 2002(10):1021–1038
Raouzaiou A, Caridakis G, Malatesta L, Karpouzis K, Grandjean D, Burkhardt F, Kollias S (2007) Emotion theory and multimodal synthesis of affective ECAs. In: Pelachaud C, Cañamero L (eds) Achieving human-like qualities in interactive virtual and physical humanoids. Special issue of the International Journal of Humanoid Robotics (submitted)
Whissel CM (1989) The dictionary of affect in language. In: Plutchik R, Kellerman H (eds) Emotion: theory, research and experience, vol 4. The measurement of emotions. Academic, New York
Acknowledgments
This research is partly supported by the EC Project HUMAINE (IST-507422).
Cite this article
Malatesta, L., Raouzaiou, A., Karpouzis, K. et al. MPEG-4 facial expression synthesis. Pers Ubiquit Comput 13, 77–83 (2009). https://doi.org/10.1007/s00779-007-0164-1