DOI: 10.1145/3371382.3378282
Generation and Evaluation of Audio-Visual Anger Emotional Expression for Android Robot

Published: 01 April 2020

Abstract

Recent studies in human-human interaction (HHI) have revealed the propensity of negative emotional expression to initiate affiliative functions that benefit the expresser and also foster cordiality and closeness among interlocutors. However, work in human-robot interaction (HRI) has not yet investigated the consequences of robots expressing negative emotion. As a first step, this study therefore aims to furnish humanoid robots with natural audio-visual anger expression for HRI. Based on analysis of a multimodal HHI corpus, we implemented different types of anger-related gestures for humanoid robots and carried out a subjective evaluation of the generated anger expressions. The findings reveal that the semantic context and functional content of anger-based utterances play a significant role in the choice of gesture to accompany such utterances. Our current results show that the "pointing" gesture is judged most appropriate for utterances containing "you" and for anger-based "questioning" utterances, while the "both arms spread" and "both-arm swing" gestures were rated more appropriate for "declarative" and "disagreement" utterances, respectively.
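The utterance-to-gesture relationship reported in the abstract can be sketched as a simple lookup from utterance category to preferred gesture. This is an illustrative assumption for clarity only; the category names, gesture labels, and `select_gesture` function below are not code or terminology from the paper.

```python
# Hypothetical sketch of the gesture-selection preferences suggested by
# the paper's subjective evaluation. All identifiers are illustrative.

GESTURE_BY_CATEGORY = {
    "second_person": "pointing",        # utterances containing "you"
    "question": "pointing",             # anger-based questioning utterances
    "declarative": "both_arms_spread",  # declarative anger utterances
    "disagreement": "both_arm_swing",   # disagreement utterances
}

def select_gesture(category: str) -> str:
    """Return the gesture rated most appropriate for an utterance category,
    falling back to no gesture for categories the evaluation did not cover."""
    return GESTURE_BY_CATEGORY.get(category, "no_gesture")

print(select_gesture("question"))      # pointing
print(select_gesture("disagreement"))  # both_arm_swing
```

A table-driven selector like this makes the evaluated preferences easy to extend as further utterance categories are studied.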


Cited By

  • (2023) "From vocal prosody to movement prosody, from HRI to understanding humans." Interaction Studies 24(1), 130-167. DOI: 10.1075/is.22010.sca. Online publication date: 28 Aug 2023.
  • (2022) "The Design and Observed Effects of Robot-performed Manual Gestures: A Systematic Review." ACM Transactions on Human-Robot Interaction 12(1), 1-62. DOI: 10.1145/3549530. Online publication date: 19 Jul 2022.

Published In

HRI '20: Companion of the 2020 ACM/IEEE International Conference on Human-Robot Interaction
March 2020
702 pages
ISBN:9781450370578
DOI:10.1145/3371382
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. anger
  2. emotion
  3. hhi
  4. hri
  5. multimodal

Qualifiers

  • Abstract

Conference

HRI '20
Acceptance Rates

Overall Acceptance Rate 192 of 519 submissions, 37%
