DOI: 10.1145/3371382.3377432

Body Language in Affective Human-Robot Interaction

Published: 01 April 2020

Abstract

Social human-robot interaction explores the ways in which social interaction can be achieved between a human and a sociable robot. Affect plays an important role in interaction, as it helps interactants coordinate and signals whether communication is succeeding. Designing socially intelligent robots requires competence in communication, including the exchange of both verbal and non-verbal cues. This project focuses on non-verbal communication, specifically body movements, postures, and gestures as means of conveying socially affective information. Using the affective grounding perspective, which conceptualizes emotion as a coordination mechanism, together with honest signals as a measure of the dynamics of the interaction, and the robot Pepper, we aim to develop a system able to communicate affect, with the goal of enhancing affective human-robot interaction.
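As an illustration of how such a system might map affect onto body movement, the sketch below projects a point on a valence-arousal plane (in the spirit of Russell's circumplex model of affect) onto coarse posture parameters. All function names, parameter names, ranges, and scaling factors are hypothetical assumptions for illustration only; they are not the authors' method or Pepper's actual API, and a real implementation would translate such parameters into joint-angle commands through the robot's motion interface.

```python
# Hypothetical sketch: mapping valence-arousal affect to coarse posture
# parameters such as a humanoid robot might express through body language.
# All names and numeric ranges here are illustrative assumptions.

def clamp(x, lo, hi):
    """Restrict x to the interval [lo, hi]."""
    return max(lo, min(hi, x))

def affect_to_posture(valence, arousal):
    """Map affect in [-1, 1]^2 to posture parameters (angles in radians).

    Positive valence lifts the head and opens the arms; high arousal
    leans the torso in and speeds up motion.
    """
    valence = clamp(valence, -1.0, 1.0)
    arousal = clamp(arousal, -1.0, 1.0)
    return {
        # Head tilts up with positive valence, down with negative.
        "head_pitch": -0.3 * valence,
        # Torso leans forward slightly when engaged (high arousal).
        "torso_lean": 0.1 * arousal,
        # Arms open with positive valence, close defensively otherwise.
        "arm_opening": 0.4 * (valence + 1.0) / 2.0,
        # Motion speed scales with arousal (fraction of maximum speed).
        "speed": clamp(0.2 + 0.6 * (arousal + 1.0) / 2.0, 0.0, 1.0),
    }

# Example: a happy, excited state yields an open, upright, fast posture.
posture = affect_to_posture(valence=0.8, arousal=0.6)
```

A continuous mapping like this (rather than a lookup of discrete basic-emotion poses) fits the affective grounding view of emotion as an ongoing coordination mechanism, since affect estimates can be updated and expressed at every step of the interaction.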



Published In

HRI '20: Companion of the 2020 ACM/IEEE International Conference on Human-Robot Interaction
March 2020
702 pages
ISBN:9781450370578
DOI:10.1145/3371382

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. affective computing
  2. body language
  3. human-robot interaction
  4. robotics
  5. social signals

Qualifiers

  • Abstract

Funding Sources

  • Technische Universität Wien, Doctoral College TrustRobots

Conference

HRI '20

Acceptance Rates

Overall Acceptance Rate 192 of 519 submissions, 37%



Cited By

  • (2024) Body Movement Mirroring and Synchrony in Human–Robot Interaction. ACM Transactions on Human-Robot Interaction 13(4), 1-26. https://doi.org/10.1145/3682074 (23 Oct 2024)
  • (2024) Modulating Perceived Authority and Warmth of Mobile Social Robots Through Bodily Openness and Vertical Movement in Gait. IEEE Robotics and Automation Letters 9(9), 7971-7978. https://doi.org/10.1109/LRA.2024.3436338 (Sep 2024)
  • (2024) The Role of Affective Computing in Social Justice: Harnessing Equity and Inclusion. Affective Computing for Social Good, 69-89. https://doi.org/10.1007/978-3-031-63821-3_4 (8 Oct 2024)
  • (2024) Unravelling the Robot Gestures Interpretation by Children with Autism Spectrum Disorder During Human-Robot Interaction. Artificial Intelligence for Neuroscience and Emotional Systems, 342-355. https://doi.org/10.1007/978-3-031-61140-7_33 (31 May 2024)
  • (2023) User Experience Design for Social Robots: A Case Study in Integrating Embodiment. Sensors 23(11), 5274. https://doi.org/10.3390/s23115274 (1 Jun 2023)
  • (2023) MoEmo Vision Transformer: Integrating Cross-Attention and Movement Vectors in 3D Pose Estimation for HRI Emotion Detection. 2023 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 9846-9852. https://doi.org/10.1109/IROS55552.2023.10342417 (1 Oct 2023)
  • (2023) Human Robot Interaction: Identifying Resembling Emotions Using Dynamic Body Gestures of Robot. 2023 3rd International Conference on Artificial Intelligence (ICAI), 39-44. https://doi.org/10.1109/ICAI58407.2023.10136649 (22 Feb 2023)
  • (2022) The Effect of Exaggerated Nonverbal Cues on the Perception of the Robot Pepper. Proceedings of the 10th International Conference on Human-Agent Interaction, 318-320. https://doi.org/10.1145/3527188.3563929 (5 Dec 2022)
  • (2021) Expression of Robot's Emotion and Intention Utilizing Physical Positioning in Conversation. Proceedings of the 9th International Conference on Human-Agent Interaction, 13-20. https://doi.org/10.1145/3472307.3484162 (9 Nov 2021)
  • (2021) Virtual Shadow Rendering for Maintaining Situation Awareness in Proximal Human-Robot Teaming. Companion of the 2021 ACM/IEEE International Conference on Human-Robot Interaction, 494-498. https://doi.org/10.1145/3434074.3447221 (8 Mar 2021)
