Abstract:
This study presents an experiment highlighting how participants combine facial expressions and haptic feedback to perceive emotions when interacting with an expressive humanoid robot. Participants were asked to interact with the humanoid robot through a handshake behavior while looking at its facial expressions. Experimental data were examined within the framework of information integration theory. Results revealed that participants combined facial and haptic cues additively when evaluating the Valence, Arousal, and Dominance dimensions. The relative importance of each modality differed across these dimensions: participants gave more weight to facial expressions when evaluating Valence, and more weight to haptic feedback when evaluating Arousal and Dominance.
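As an illustration of the additive integration reported above, a minimal sketch of the kind of functional model used in information integration theory is given below; the weights and scale values are assumed notation for clarity, not quantities reported in the abstract.

% Illustrative additive model (assumed notation): the judged rating R on a given
% dimension (e.g., Valence) is a weighted sum of the facial cue's scale value s_F
% and the haptic cue's scale value s_H.
R = w_F \, s_F + w_H \, s_H
% A modality's relative importance corresponds to its weight, e.g. w_F > w_H for
% Valence, and w_H > w_F for Arousal and Dominance, per the reported results.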
Published in: 2015 International Conference on Affective Computing and Intelligent Interaction (ACII)
Date of Conference: 21-24 September 2015
Date Added to IEEE Xplore: 07 December 2015
Electronic ISSN: 2156-8111