Static and Temporal Differences in Social Signals Between Error-Free and Erroneous Situations in Human-Robot Collaboration

  • Conference paper
  • Published in: Social Robotics (ICSR 2019)
  • Part of the book series: Lecture Notes in Computer Science (LNAI, volume 11876)

Abstract

The capability of differentiating error situations from error-free situations in human-robot collaboration is a mandatory skill for collaborative robots. One of the variables that robots can analyse to differentiate the two situations is the social signals from the human interaction partner. We performed an extensive human-robot collaboration user study with 50 participants in which the robot purposefully executed erroneous behaviours. We annotated the occurrences and durations of multimodal social signals from the participants during both error-free and error situations using an automatic video annotation method based on OpenFace. An analysis of the annotations shows that the participants display more facial expressions, head gestures, and gaze shifts during error situations than during error-free situations. The durations of the facial expressions and gaze shifts are also longer during error situations. Our results additionally show that, in error situations compared to error-free situations, people look at the robot and the table for a longer duration and at the objects for a shorter duration. These results are essential for the development of automatic error recognition and error handling in human-robot collaboration.
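The annotation procedure described above boils down to turning frame-wise detections (e.g. OpenFace action-unit or gaze outputs thresholded into a binary "signal present" series) into occurrence counts and durations. The following is a minimal sketch of that reduction, not the authors' actual pipeline; the frame rate and the binary input series are illustrative assumptions.

```python
# Hedged sketch (not the paper's implementation): count occurrences and
# durations of a social signal from a per-frame binary presence series,
# such as a thresholded OpenFace action-unit or gaze-shift detection.

def occurrences_and_durations(frames, fps):
    """Return (occurrence_count, list_of_durations_in_seconds)
    for contiguous runs of truthy values in `frames`."""
    durations = []
    run = 0  # length of the current run of "signal present" frames
    for present in frames:
        if present:
            run += 1
        elif run:
            durations.append(run / fps)  # run ended: convert frames to seconds
            run = 0
    if run:  # close a run that extends to the last frame
        durations.append(run / fps)
    return len(durations), durations

# Illustrative input: a signal present for frames 2-4 and 7-8 at 30 fps.
frames = [0, 0, 1, 1, 1, 0, 0, 1, 1, 0]
count, durs = occurrences_and_durations(frames, fps=30)
# count == 2; durations are 0.1 s and 2/30 s
```

With such per-situation counts and durations in hand, comparing means between error and error-free segments is a straightforward aggregation step.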


Notes

  1. https://new.abb.com/products/robotics/industrial-robots/irb-14000-yumi
  2. https://gitlab.uni-koblenz.de/robbie/homer_robot_face-release
  3. https://wiki.ros.org/rosbag


Acknowledgments

The first author acknowledges the scholarship support from the Ministry of Research and Higher Education (KEMENRISTEKDIKTI) of the Republic of Indonesia through the Research and Innovation in Science and Technology (RISET-Pro) Program (World Bank Loan No. 8245-ID).

Author information


Correspondence to Dito Eka Cahya, Rahul Ramakrishnan, or Manuel Giuliani.


Copyright information

© 2019 Springer Nature Switzerland AG

About this paper

Cite this paper

Cahya, D.E., Ramakrishnan, R., Giuliani, M. (2019). Static and Temporal Differences in Social Signals Between Error-Free and Erroneous Situations in Human-Robot Collaboration. In: Salichs, M., et al. (eds.) Social Robotics. ICSR 2019. Lecture Notes in Computer Science (LNAI), vol. 11876. Springer, Cham. https://doi.org/10.1007/978-3-030-35888-4_18

  • DOI: https://doi.org/10.1007/978-3-030-35888-4_18

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-35887-7

  • Online ISBN: 978-3-030-35888-4

  • eBook Packages: Computer Science, Computer Science (R0)
