Abstract
Intelligent interaction with an environment, other IVAs, and human users requires a system that identifies subtle expressive cues and communicates naturally through modalities such as the body, face, and voice. Although research on individual affective channels has increased, little is known about expressive qualities of whole-body movement. This study has three goals: (1) to determine rates of observer recognition of emotion in walking, (2) to use kinematic analysis to quantify how emotions change gait patterns in characteristic ways, and (3) to describe the concurrence of facial and bodily expression of emotion.
Twenty-six undergraduate students recalled an experience from their own lives in which they felt anger, sadness, contentment, joy, or no emotion at all (neutral). After recalling a target emotion, participants walked across the laboratory. Whole-body motion capture data were acquired using a video-based, six-camera system. Side-view video was also recorded. Ten participants wore a special head-mounted camera designed to record video of facial expression. After each trial, participants rated the intensity of eight emotions (four target and four non-target). The faces in the side-view video were blurred so that facial expressions were not observable, and randomized composite videos were then shown to untrained observers. After viewing each video clip, observers selected one of ten responses corresponding to the emotion that they thought the walker felt during the trial. FACS coding was used to evaluate the face video for evidence of emotion and for the timing of facial expressions with respect to the gait cycle.
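To illustrate how such a forced-choice recognition measure can be scored, the following sketch (in Python) computes per-emotion recognition rates and the corresponding chance level (1 in 10 with ten response options). The data layout and function names are illustrative assumptions, not the authors' analysis code.

from collections import defaultdict

N_RESPONSE_OPTIONS = 10                    # ten forced-choice options => chance = 10%
CHANCE_LEVEL = 1.0 / N_RESPONSE_OPTIONS

def recognition_rates(trials):
    """trials: iterable of (target_emotion, observer_response) pairs."""
    hits, totals = defaultdict(int), defaultdict(int)
    for target, response in trials:
        totals[target] += 1
        hits[target] += int(response == target)
    return {emotion: hits[emotion] / totals[emotion] for emotion in totals}

# Illustrative use with made-up responses; a rate exceeds chance only if it is clearly above 0.10.
rates = recognition_rates([("sad", "sad"), ("sad", "neutral"), ("joy", "content")])
for emotion, rate in rates.items():
    print(f"{emotion}: {rate:.0%} (chance = {CHANCE_LEVEL:.0%})")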
Self-report data indicated that the walkers felt the target emotions at levels corresponding to “moderately” or above in all trials. Validation data were collected from five observers on gait trials from a subset of participants (n = 16). Recognition rates for sadness, anger, neutral, and contentment were 45%, 25%, 20%, and 16%, respectively. Joy was recognized at chance level (10%). Normalized velocity, normalized stride length, cycle duration, and velocity were significantly affected by emotion.
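As a rough illustration of how such temporal-spatial parameters can be derived from motion capture data, the sketch below estimates cycle duration, stride length, and velocity from a heel-marker trajectory and normalizes them by leg length. The marker conventions, event-detection rule, and normalization are assumptions for illustration; the abstract does not specify the authors' processing pipeline.

import numpy as np

def gait_parameters(heel_x, heel_y, fs, leg_length):
    """heel_x: forward heel-marker trajectory (m); heel_y: vertical trajectory (m);
    fs: sampling rate (Hz); leg_length: body-size measure used for normalization (m)."""
    # Crude heel-strike detection: local minima of the vertical heel trajectory.
    strikes = [i for i in range(1, len(heel_y) - 1)
               if heel_y[i] < heel_y[i - 1] and heel_y[i] < heel_y[i + 1]]
    cycles = list(zip(strikes[:-1], strikes[1:]))   # successive heel strikes of the same foot
    cycle_duration = float(np.mean([(b - a) / fs for a, b in cycles]))               # s
    stride_length = float(np.mean([abs(heel_x[b] - heel_x[a]) for a, b in cycles]))  # m
    velocity = stride_length / cycle_duration                                        # m/s
    return {
        "cycle_duration_s": cycle_duration,
        "stride_length_m": stride_length,
        "velocity_m_s": velocity,
        "normalized_stride_length": stride_length / leg_length,  # dimensionless
        "normalized_velocity": velocity / leg_length,             # per-leg-length scaling
    }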
This study is unique in describing the effects of specific emotions on gait. The preliminary results indicate that gait kinematics change with emotion. Although temporal-spatial kinematics were related to arousal levels, angular kinematics are needed to distinguish emotions with similar levels of arousal.
Copyright information
© 2006 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Crane, E.A., Gross, M.M., Fredrickson, B.L. (2006). Expression of Emotion in Body and Face. In: Gratch, J., Young, M., Aylett, R., Ballin, D., Olivier, P. (eds) Intelligent Virtual Agents. IVA 2006. Lecture Notes in Computer Science, vol. 4133. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11821830_41
DOI: https://doi.org/10.1007/11821830_41
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-37593-7
Online ISBN: 978-3-540-37594-4
eBook Packages: Computer Science, Computer Science (R0)