ABSTRACT
For robots to be socially accepted and to elicit empathy, they must display emotions. For robots such as Nao, which cannot produce facial expressions, body language is the best available medium. Displaying emotional body language that users can interpret while interacting with the robot should greatly improve its acceptance.
This research investigates the creation of an "Affect Space" [1] for the generation of emotional body language that could be displayed by robots. An Affect Space is generated by "blending" (i.e. interpolating between) different emotional expressions to create new ones. An Affect Space for body language based on the Circumplex Model of emotions [2] has been created.
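The blending described above can be sketched as interpolation between emotional key poses. The following is a minimal illustrative sketch only: the joint names, angle values, and the linear-interpolation scheme are assumptions for illustration, not the paper's actual parameters or method.

```python
# Sketch of "blending" two emotional key poses by interpolation.
# A pose is modelled as a mapping from joint name to angle (radians).
# Joint names and values below are hypothetical, not from the paper.

def blend_poses(pose_a, pose_b, alpha):
    """Linearly interpolate between two key poses.

    alpha = 0.0 returns pose_a; alpha = 1.0 returns pose_b;
    intermediate values yield new, blended expressions.
    """
    return {joint: (1 - alpha) * pose_a[joint] + alpha * pose_b[joint]
            for joint in pose_a}

# Hypothetical key poses for two emotions.
sad   = {"HeadPitch": 0.5,  "LShoulderPitch": 1.8, "RShoulderPitch": 1.8}
happy = {"HeadPitch": -0.4, "LShoulderPitch": 1.2, "RShoulderPitch": 1.2}

# A new expression halfway between the two key poses.
blended = blend_poses(sad, happy, 0.5)
```

In an Affect Space based on the Circumplex Model, the blend weight would be derived from a target position on the valence-arousal plane relative to the surrounding key poses, rather than chosen directly as here.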
The experiment reported in this paper investigated the perception of specific key poses from the Affect Space. The results suggest that this Affect Space for body expressions can be used to improve the expressiveness of humanoid robots.
In addition, early results of a pilot study are described. They reveal that context helps human subjects improve their recognition rate during a human-robot imitation game, and that this improved recognition in turn leads to better interaction outcomes.
REFERENCES
1. Breazeal, C., Designing Sociable Robots. Intelligent Robotics & Autonomous Agents. 2002: MIT Press.
2. Russell, J.A., A circumplex model of affect. Journal of Personality and Social Psychology, 1980. 39: p. 1161--1178.
3. Aldebaran, http://www.aldebaran-robotics.com/. 2010.
4. Beck, A., Cañamero, L. and Bard, K., Toward an affect space for robots to display body language. In Proceedings of the International Symposium Re-thinking Interaction with Robots (Ro-Man 2010).
5. Gillies, M., et al., Responsive listening behavior. Computer Animation and Virtual Worlds, vol. 19, pp. 579--589, 2008.
6. Saerbeck, M. and Bartneck, C., Perception of affect elicited by robot motion. In Proceedings of Human-Robot Interaction (HRI 2010). ACM/IEEE: Osaka. p. 53--60.
7. Andry, P., Gaussier, P., Moga, S., Banquet, J.P. and Nadel, J., Learning and communication via imitation: an autonomous robot perspective. IEEE Transactions on Systems, Man, and Cybernetics, vol. 31, no. 5, pp. 431--442, 2001.
8. Andry, P., Garnault, N. and Gaussier, P., Using the interaction rhythm to build an internal reinforcement signal: a tool for intuitive HRI. In Proceedings of the Ninth International Conference on Epigenetic Robotics, 2009.
Index Terms
- Interpretation of emotional body language displayed by robots