Research article · DOI: 10.1145/2628257.2628267

Assessing naturalness and emotional intensity: a perceptual study of animated facial motion

Published: 08 August 2014

Abstract

Animated characters appear in applications for entertainment, education, and therapy. When these characters display appropriate emotions for their context, they can be particularly effective. Characters can display emotions by accurately mimicking the facial expressions and vocal cues that people display or by damping or exaggerating the emotionality of the expressions. In this work, we explored which of these strategies would be most effective for animated characters. We investigated the effects of altering the auditory and facial levels of expressiveness on emotion recognition accuracy and ratings of perceived emotional intensity and naturalness. We ran an experiment with emotion (angry, happy, sad), auditory emotion level (low, high), and facial motion magnitude (damped, unaltered, exaggerated) as within-subjects factors. Participants evaluated animations of a character whose facial motion matched that of an actress we tracked using an active appearance model. This method of tracking and animation can capture subtle facial motions in real-time, a necessity for many interactive animated characters. We manipulated auditory emotion level by asking the actress to speak sentences at varying levels, and we manipulated facial motion magnitude by exaggerating and damping the actress's spatial motion. We found that the magnitude of auditory expressiveness was positively related to emotion recognition accuracy and ratings of emotional intensity. The magnitude of facial motion was positively related to ratings of emotional intensity but negatively related to ratings of naturalness.
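The paper's exaggeration and damping of spatial motion amounts to scaling tracked landmark displacements about a neutral (rest) face shape. The abstract gives no implementation details, so the following NumPy sketch is purely illustrative: the function name, array shapes, and values are hypothetical, not taken from the authors' system.

```python
import numpy as np

def scale_facial_motion(frames, neutral, gain):
    """Scale per-frame landmark displacements about a neutral shape.

    frames  : (T, N, 2) array of tracked 2-D landmark positions over T frames
    neutral : (N, 2) neutral (rest) face shape
    gain    : < 1 damps, 1 leaves unaltered, > 1 exaggerates the motion
    """
    frames = np.asarray(frames, dtype=float)
    # Displacement from the neutral shape is scaled; the neutral pose is preserved.
    return neutral + gain * (frames - neutral)

# Hypothetical example: two frames of three landmarks, neutral at the origin.
neutral = np.zeros((3, 2))
frames = np.array([[[1.0, 0.0], [0.0, 2.0], [1.0, 1.0]],
                   [[2.0, 0.0], [0.0, 4.0], [2.0, 2.0]]])
damped = scale_facial_motion(frames, neutral, 0.5)       # motion halved
exaggerated = scale_facial_motion(frames, neutral, 1.5)  # motion amplified
```

In an AAM-based pipeline like the one described, the same scaling could instead be applied to the model's shape parameters rather than raw landmark coordinates; this sketch shows only the general idea.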




      Published In

      SAP '14: Proceedings of the ACM Symposium on Applied Perception
August 2014
137 pages
ISBN: 9781450330091
DOI: 10.1145/2628257


      Publisher

      Association for Computing Machinery

      New York, NY, United States


      Author Tags

      1. animation
      2. audiovisual perception
      3. emotion
      4. face


      Conference

SAP '14: ACM Symposium on Applied Perception 2014
August 8-9, 2014
Vancouver, British Columbia, Canada

      Acceptance Rates

Overall Acceptance Rate: 43 of 94 submissions, 46%


