Abstract
Understanding how people perceive robot gestures will aid the design of robots capable of social interaction with humans. We examined the generation and perception of a restricted form of gesture in a robot capable of simple head and arm movement, drawing on point-light animation and video experiments on human motion to derive our hypotheses. Four studies examined the effects of situational context, gesture complexity, emotional valence and author expertise. In Study 1, four participants created gestures conveying emotions for each of 12 provided scenarios. In Study 2, 12 participants judged the resulting gestures; their recognition of emotion was better than chance and improved when situational context was provided, and ratings of lifelikeness were related to the number of arm movements (but not head movements) in a gesture. In Study 3, five novices and five puppeteers created gestures conveying Ekman’s six basic emotions, which were shown to 12 participants in Study 4. Puppetry experience improved identification rates only for fear and disgust, possibly because of limitations in the robot’s movement. The results demonstrate that a social robot capable of only simple head and arm movement can communicate emotion.
References
Breazeal C (2003) Toward sociable robots. Robot Auton Syst 42:167–175
Fong T, Nourbakhsh I, Dautenhahn K (2003) A survey of socially interactive robots. Robot Auton Syst 42:143–166
Mizoguchi H, Sato T, Takagi K, Nakao M, Hatamura Y (1997) Realization of expressive mobile robot. In: Proceedings of the international conference on robotics and automation, pp 581–586
Reeves B, Nass C (1996) The media equation. Cambridge University Press, Cambridge
Lee KM, Peng W, Jin S-A, Yan C (2006) Can robots manifest personality? An empirical test of personality recognition social responses and social presence in human–robot interaction. J Commun 56:754–772
Sidner C, Lee C, Morency L-P, Forlines C (2006) The effect of head-nod recognition in human-robot conversation. In: Proc of ACM SIGCHI/SIGART conference on HRI, pp 290–296
Marui N, Matsumaru T (2005) Emotional motion of human-friendly robot: emotional expression with bodily movement as the motion media. Nippon Robotto Gakkai Gakujutsu Koenkai Yokoshu 23:2H12
Bacon F (1815) The works of Sir Francis Bacon. Jones, London
McNeill D (1987) Psycholinguistics: a new approach. Harper & Row, New York
Wachsmuth I, Lenzen M, Knoblich G (2008) Embodied communication in humans and machines. Oxford University Press, London
Argyle M (1994) The psychology of interpersonal behaviour, 5th edn. Penguin, London
Nehaniv C (2005) Classifying types of gesture and inferring intent. In: Proc AISB’05 symposium on robot companions the society for the study of artificial intelligence and simulation of behaviour, pp 74–81
Levy D (2007) Intimate relationships with artificial partners. PhD thesis, University of Maastricht
Cassell J (2000) Embodied conversational interface agents. Commun ACM 43(4):70–78
Cassell J, Thorisson KR (1999) The power of a nod and a glance: envelope vs emotional feedback in animated conversational agents. Appl Artif Intell 13(4):519–538
Hodgins JK, O’Brien JF, Tumblin J (1998) Perception of human motion with different geometrical models. IEEE Trans Vis Comput Graph 4:307–317
Blake R, Shiffrar M (2007) Perception of human motion. Annu Rev Psychol 58:47–73
Su M-H, Lee W-P, Wang J-H (2004) A user-oriented framework for the design and implementation of pet robots. In: Proceedings of the 2004 IEEE international conference on systems, man and cybernetics, 10–13 October 2004, The Hague, Netherlands. IEEE, Piscataway, NJ
Silva DC, Vinhas V, Reis LP, Oliveira E (2009) Biometric emotion assessment and feedback in an immersive digital environment. Int J Soc Robot 1(4):301–317
Ekman P, Friesen WV, Ellsworth P (1972) Emotion in the human face: guidelines for research and an integration of findings. Pergamon Press, New York
Schlosberg H (1954) Three dimensions of emotion. Psychol Rev 61:81–84
Montepare J, Koff E, Zaitchik D, Albert M (1999) The use of body movements and gestures as cues to emotions in younger and older adults. J Nonverbal Behav 23(2):133–152
Atkinson AP, Dittrich WH, Gemmell AJ, Young AW (2004) Emotion perception from dynamic and static body expressions in point-light and full-light displays. Perception 33:717–746
de Meijer M (1989) The contribution of general features of body movement to the attribution of emotions. J Nonverbal Behav 13:247–268
Dittrich WH, Troscianko T, Lea S, Morgan D (1996) Perception of emotion from dynamic point-light displays represented in dance. Perception 25:727–738
Pollick FE, Paterson HM, Bruderlin A, Sanford AJ (2001) Perceiving affect from arm movement. Cognition 82:B51–B61
Clarke TJ, Bradshaw MF, Field DT, Hampson SE, Rose D (2005) The perception of emotion from body movement in point-light displays of interpersonal dialogue. Perception 34(10):1171–1180
Shaarani AS, Romano DM (2006) Basic emotions from body movements. In: (CCID 2006) The first international symposium on culture creativity and interaction design HCI 2006 workshops, the 20th BCS HCI group conference, Queen Mary University of London, UK
Ekman P, Friesen WV (1969) The repertoire of nonverbal behavior: categories, origins, usage and coding. Semiotica 1:49–98
Rosenthal R, DePaulo B (1979) Sex differences in eavesdropping on nonverbal cues. J Pers Soc Psychol 37(2):273–285
McNeill D (2005) Gesture and thought. The University of Chicago Press, Chicago
Kret ME, de Gelder B (2010) Recognition of emotion in body postures is influenced by social context. Exp Brain Res 206(1):169–180
Frijda N (1986) Emotions. Cambridge University Press, Cambridge
Atkinson A, Tunstall M, Dittrich W (2007) Evidence for distinct contributions of form and motion information to the recognition of emotions from body gestures. Cognition 104(1):59–72
Sawada M, Suda K, Ishii M (2003) Expression of emotions in dance: relation between arm movement characteristics and emotion. Percept Mot Skills 97:697–708
Rakison DH, Poulin-Dubois D (2001) Developmental origin of the animate–inanimate distinction. Psychol Bull 127(2):209–228
Leslie AM (1994) ToMM ToBy and agency: core architecture and domain specificity. In: Hirschfield L, Gelman S (eds) Mapping the mind: domain specificity in cognition and culture. Cambridge University Press, Cambridge, pp 119–148
Morewedge C, Preston J, Wegner D (2007) Timescale bias in the attribution of mind. J Pers Soc Psychol 93(1):1–11
Opfer J (2002) Identifying living and sentient kinds from dynamic information: the case of goal-directed versus aimless autonomous movement in conceptual change. Cognition 86:97–122
Premack D (1990) The infant’s theory of self-propelled objects. Cognition 36:1–16
Gelman R, Durgin F, Kaufman L (1995) Distinguishing between animates and inanimates: not by motion alone. In: Sperber D, Premack D, Premack A (eds) Causal cognition: a multi-disciplinary debate. Oxford University Press, Oxford, pp 150–184
Bassili JN (1976) Temporal and spatial contingencies in the perception of social events. J Pers Soc Psychol 33:680–685
Tremoulet PD, Feldman J (2000) Perception of animacy from the motion of a single object. Perception 29:943–951
Trafton J, Trickett S, Stitzlein C, Saner L, Schunn C, Kirschenbaum S (2006) The relationship between spatial transformations and iconic gestures. Spat Cogn Comput 6(1):1–29
Chase WG, Simon HA (1973) Perception in chess. Cogn Psychol 4:55–81
Chi MTH, Feltovich PJ, Glaser R (1981) Categorization and representation of physics problems by experts and novices. Cogn Sci 5:121–152
Loula F, Prasad S, Harber K, Shiffrar M (2005) Recognizing people from their movement. J Exp Psychol Hum Percept Perform 31:210–220
Latshaw G (1978) The complete book of puppetry. Dover, Mineola
Blumenthal E (2005) Puppetry: a world history. Harry N Abrams, New York
Logan D (2007) Puppetry. Brisbane Dramatic Arts Company, Brisbane
Sturman D (1998) Computer puppetry. IEEE Comput Graph Appl 18(1):38–45
Fukuda H, Ueda K (2010) Interaction with a moving object affects one’s perception of its animacy. Int J Soc Robot 2(2):187–193
Lim H, Ishii A, Takanishi A (1999) Basic emotional walking using a biped humanoid robot. In: Proceedings of the IEEE SMC 1999
Nakagawa K, Shinozawa K, Ishiguro H, Akimoto T, Hagita N (2009) Motion modification method to control affective nuances for robots. In: Proceedings of the 2009 IEEE/RSJ international conference on intelligent robots and systems, pp 3727–3734
Biocca F (1997) The cyborg’s dilemma: progressive embodiment in virtual environments. J Comput-Mediat Commun 3(2). Available: http://www.ascusc.org/jcmc/vol3/issue2/biocca2.html
Kidd C, Breazeal C (2005) Comparison of social presence in robots and animated characters. In: Proc of human-computer interaction (CHI)
Ono T, Ishiguro H, Imai M (2001) A model of embodied communications with gestures between humans and robots. In: Proceedings of the 23rd annual meeting of the cognitive science society. Erlbaum, Mahwah, NJ
Kanda T, Ishiguro H, Imai M, Ono T (2003) Body movement analysis of human-robot interaction. In: Proc of int joint conference on artificial intelligence (IJCAI 2003), pp 177–182
Zlatev J (1999) The epigenesis of meaning in human beings and possibly in robots. Lund University Cognitive Studies 79, Lund University
Demiris J, Hayes G (1999) Active and passive routes to imitation. In: Proceedings of the AISB symposium on imitation in animals and artifacts
Xing S, Chen I-M (2002) Design expressive behaviors for robotic puppet. In: Proceedings of 7th international conference on control automation robotics and vision (ICARCV ’02), Dec 2002, Singapore, pp 378–382
Plaisant C, Druin A, Lathan C, Dakhane K, Edwards K, Vice JM, Montemayor J (2000) A storytelling robot for pediatric rehabilitation. In: Proc ASSETS ’00
Sabanovic S, Meisner E, Caporael L, Isler V, Trinkle J (2009) Outside-in design for interdisciplinary HRI research. In 2009 AAAI spring symposium on experimental design for real-world systems
Meisner E, Sabanovic S, Isler V, Caporael L, Trinkle J (2009) ShadowPlay: a generative model for nonverbal human-robot interaction. In: Proceedings of the 4th ACM/IEEE international conference on human robot interaction (HRI ’09), 11–13 March 2009, La Jolla, California. ACM, New York
Sekiguchi D, Inami M, Tachi S (2004) The design of internet-based RobotPHONE. In: Proceedings of 14th international conference on artificial reality, pp 223–228
Nomura T, Suzuki T, Kanda T, Kato K (2006) Altered attitudes of people toward robots: investigation through the negative attitudes toward robots scale. In: Proc AAAI-06 workshop on human implications of human-robot interaction, pp 29–35
Nomura T, Nakao A (2010) Comparison on identification of affective body motions by robots between elder people and university students: a case study in Japan. Int J Soc Robot 2(2):147–157
Itoh K, Miwa H, Matsumoto M, Zecca M, Takanobu H, Roccella S, Carrozza MC, Dario P, Takanishi A (2004) Various emotion expression humanoid robot WE-4RII. In: 1st IEEE technical exhibition based conference on robotics and automation (TExCRA 2004), November 18–19, 2004, Tokyo, Japan, pp 35–36
Carpenter J, Davis J, Erwin-Stewart N, Lee T, Bransford J, Vye N (2009) Gender representation and humanoid robots designed for domestic use. Int J Soc Robot 1(3):261–265
Tanaka A, Koizumi A, Imai H, Hiramatsu S, Hiramoto E, de Gelder B (2010) I feel your voice: cultural differences in the multisensory perception of emotion. Psychol Sci (in press). doi:10.1177/0956797610380698
Beattie G (2003) Visible thought: the new psychology of body language. Routledge, London
Additional information
Funding provided by the Japan Society for the Promotion of Science (JSPS), the Natural Sciences and Engineering Research Council of Canada (NSERC), Bell University Labs and the University of Toronto.
Cite this article
Li, J., Chignell, M. Communication of Emotion in Social Robots through Simple Head and Arm Movements. Int J of Soc Robotics 3, 125–142 (2011). https://doi.org/10.1007/s12369-010-0071-x