DOI: 10.1145/3371382.3378319

Perception of Emotional Gait-like Motion of Mobile Humanoid Robot Using Vertical Oscillation

Published: 01 April 2020

Abstract

This paper presents a subjective evaluation of the emotions expressed by a wheeled mobile humanoid robot that conveys emotion during movement by replicating human gait-induced upper body motion. For this purpose, we propose a robot equipped with a vertical oscillation mechanism that generates such motion based on the human center-of-mass trajectory. In the experiment, participants watched videos of the robot performing different emotional gait-induced upper body motions and assessed both the type of emotion shown and their confidence in each answer. We calculated the emotion recognition rate and the average confidence level of the answers. The results show that participants reported higher confidence when assessing the robot's emotional movement with vertical oscillation than without it.
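
As an illustrative sketch only, not the authors' implementation: the Python snippet below assumes the vertical center-of-mass motion of walking can be approximated by a sinusoid whose amplitude and frequency are modulated per emotion, and shows how the recognition rate and average confidence described above could be computed. The parameter values and the names EMOTION_PARAMS, emotional_trajectory, and recognition_rate_and_confidence are hypothetical.

import numpy as np

def vertical_com_oscillation(t, step_frequency=2.0, amplitude=0.02):
    # Sinusoidal approximation of the vertical center-of-mass displacement
    # during walking: one peak per step, so the oscillation frequency equals
    # the step frequency. Values are illustrative, not taken from the paper.
    return amplitude * np.sin(2.0 * np.pi * step_frequency * t)

# Hypothetical emotion-dependent parameters (amplitude in metres, step
# frequency in Hz), chosen only to show how an emotional style could
# modulate the oscillation.
EMOTION_PARAMS = {
    "happiness": {"amplitude": 0.03, "step_frequency": 2.2},
    "sadness":   {"amplitude": 0.01, "step_frequency": 1.6},
    "neutral":   {"amplitude": 0.02, "step_frequency": 2.0},
}

def emotional_trajectory(emotion, duration=5.0, dt=0.01):
    # Sample a vertical-offset command for the robot's torso over `duration` seconds.
    p = EMOTION_PARAMS[emotion]
    t = np.arange(0.0, duration, dt)
    return t, vertical_com_oscillation(t, p["step_frequency"], p["amplitude"])

def recognition_rate_and_confidence(responses):
    # responses: list of (recognized_correctly: bool, confidence: float) pairs,
    # one per participant answer for a given condition.
    correct = sum(1 for ok, _ in responses if ok)
    mean_conf = sum(conf for _, conf in responses) / len(responses)
    return correct / len(responses), mean_conf

# Example usage with made-up responses:
# t, z = emotional_trajectory("happiness")
# rate, conf = recognition_rate_and_confidence([(True, 4.0), (False, 2.0), (True, 5.0)])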

    Published In

    HRI '20: Companion of the 2020 ACM/IEEE International Conference on Human-Robot Interaction
    March 2020
    702 pages
    ISBN:9781450370578
    DOI:10.1145/3371382
    Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 01 April 2020

    Author Tags

    1. emotion
    2. gait
    3. human-like motion
    4. humanoid
    5. mobile robot

    Qualifiers

    • Abstract

    Funding Sources

    • JST ERATO
    • Grant-in-Aid for JSPS Fellows

    Conference

    HRI '20

    Acceptance Rates

    Overall Acceptance Rate 192 of 519 submissions, 37%

    Cited By

    • (2025) Gait-to-Gait Emotional Human–Robot Interaction Utilizing Trajectories-Aware and Skeleton-Graph-Aware Spatial–Temporal Transformer. Sensors 25(3), 734. https://doi.org/10.3390/s25030734. Online publication date: 25-Jan-2025.
    • (2024) Human Perception of the Emotional Expressions of Humanoid Robot Body Movements: Evidence from Survey and Eye-Tracking Measurements. Biomimetics 9(11), 684. https://doi.org/10.3390/biomimetics9110684. Online publication date: 8-Nov-2024.
    • (2024) What Kinds of Facial Self-Touches Strengthen Expressed Emotions? 2024 33rd IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), 446-452. https://doi.org/10.1109/RO-MAN60168.2024.10731424. Online publication date: 26-Aug-2024.
    • (2023) Conventional, Heuristic and Learning-Based Robot Motion Planning: Reviewing Frameworks of Current Practical Significance. Machines 11(7), 722. https://doi.org/10.3390/machines11070722. Online publication date: 7-Jul-2023.
    • (2021) Can an android's posture and movement discriminate against the ambiguous emotion perceived from its facial expressions? PLOS ONE 16(8), e0254905. https://doi.org/10.1371/journal.pone.0254905. Online publication date: 10-Aug-2021.
    • (2021) Modeling the Timing and Duration of Grip Behavior to Express Emotions for a Social Robot. IEEE Robotics and Automation Letters 6(1), 159-166. https://doi.org/10.1109/LRA.2020.3036372. Online publication date: Jan-2021.
    • (2021) Perception of Emotional Expression of Mobile Humanoid Robot Using Gait-Induced Upper Body Motion. IEEE Access 9, 124793-124804. https://doi.org/10.1109/ACCESS.2021.3110160. Online publication date: 2021.
