Perceptions of Quantitative and Affective Meaning from Humanoid Robot Hand Gestures

  • Conference paper
Social Robotics (ICSR 2021)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 13086)

Abstract

People use their hands in a variety of ways to communicate information along with speech during face-to-face conversation. Humanoid robots designed to converse with people need to be able to use their hands in similar ways, both to increase the naturalness of the interaction and to communicate additional information in the same way people do. However, there are few studies of the specific meanings that people derive from robot hand gestures, particularly for more abstract gestures, such as so-called metaphoric gestures, that may be used to communicate quantitative or affective information. We conducted an exhaustive study of the 51 hand gestures built into a commercial humanoid robot to determine the quantitative and affective meaning that people derive from observing them without accompanying speech. We find that hypotheses relating gesture envelope parameters (e.g., height, distance from body) to metaphorically corresponding quantitative and affective concepts are largely supported.
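
The stimuli described here are the robot's built-in gesture animations, played back without speech (the video link in the Notes suggests the robot is SoftBank's Pepper). For readers unfamiliar with how such stimuli are produced, the sketch below shows one way a built-in gesture might be triggered through the NAOqi Python SDK; this is our illustration of the general mechanism under those assumptions, not the authors' documented procedure, and the robot address and animation name are placeholder examples.

    # A minimal sketch (an assumption, not the authors' documented setup):
    # playing one of a robot's built-in gestures without accompanying speech,
    # using the NAOqi Python SDK that ships with Pepper-class robots.
    import qi

    # Connect to the robot; "<robot-ip>" is a placeholder for its address,
    # and 9559 is the default NAOqi port.
    session = qi.Session()
    session.connect("tcp://<robot-ip>:9559")

    # ALAnimationPlayer plays named animations from the robot's built-in
    # library; conversational gestures live under animations/Stand/Gestures/.
    animation_player = session.service("ALAnimationPlayer")
    animation_player.run("animations/Stand/Gestures/Explain_1")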

Notes

  1. All videos are available at: https://tinyurl.com/PepperGesture.

Acknowledgements

Sumanth Munikoti assisted in generating the experimental stimuli, and Nwabisi Chikwendu assisted with the online study.

Author information

Correspondence to Timothy Bickmore.

Copyright information

© 2021 Springer Nature Switzerland AG

About this paper

Cite this paper

Bickmore, T., Murali, P., Terzioglu, Y., Zhou, S. (2021). Perceptions of Quantitative and Affective Meaning from Humanoid Robot Hand Gestures. In: Li, H., et al. (eds.) Social Robotics. ICSR 2021. Lecture Notes in Computer Science (LNAI), vol. 13086. Springer, Cham. https://doi.org/10.1007/978-3-030-90525-5_33

  • DOI: https://doi.org/10.1007/978-3-030-90525-5_33

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-90524-8

  • Online ISBN: 978-3-030-90525-5

  • eBook Packages: Computer Science, Computer Science (R0)
