Abstract
People use their hands in a variety of ways to communicate information along with speech during face-to-face conversation. Humanoid robots designed to converse with people need to be able to use their hands in similar ways, both to increase the naturalness of the interaction and to communicate additional information in the same way people do. However, there are few studies of the particular meanings that people derive from robot hand gestures, particularly for more abstract gestures such as so-called metaphoric gestures that may be used to communicate quantitative or affective information. We conducted an exhaustive study of the 51 hand gestures built into a commercial humanoid robot to determine the quantitative and affective meaning that people derive from observing them without accompanying speech. We find that hypotheses relating gesture envelope parameters (e.g., height, distance from body) to metaphorically corresponding quantitative and affective concepts are largely supported.
Notes
1. All videos are available at: https://tinyurl.com/PepperGesture.
Acknowledgements
Sumanth Munikoti assisted in generating the experimental stimuli, and Nwabisi Chikwendu assisted with the online study.
Copyright information
© 2021 Springer Nature Switzerland AG
Cite this paper
Bickmore, T., Murali, P., Terzioglu, Y., Zhou, S. (2021). Perceptions of Quantitative and Affective Meaning from Humanoid Robot Hand Gestures. In: Li, H., et al. (eds.) Social Robotics. ICSR 2021. Lecture Notes in Computer Science, vol. 13086. Springer, Cham. https://doi.org/10.1007/978-3-030-90525-5_33
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-90524-8
Online ISBN: 978-3-030-90525-5