Abstract
The ability to differentiate error situations from error-free situations in human-robot collaboration is a mandatory skill for collaborative robots. One of the variables that a robot can analyse to differentiate between the two situations is the social signals of its human interaction partner. We performed an extensive human-robot collaboration user study with 50 participants in which the robot purposefully executed erroneous behaviours. Using an automatic video annotation method based on OpenFace, we annotated the occurrences and durations of the participants' multimodal social signals during both error-free situations and error situations. An analysis of the annotations shows that participants produce more facial expressions, head gestures, and gaze shifts during erroneous situations than during error-free situations. Facial expressions and gaze shifts also last longer during error situations. Our results additionally show that, in error situations, people look at the robot and the table for longer durations and at the objects for shorter durations than in error-free situations. These findings are essential for the development of automatic error recognition and error handling in human-robot collaboration.
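The annotation approach described above can be illustrated with a minimal sketch. OpenFace 2.0 emits per-frame CSV rows with action-unit intensity columns (e.g. `AU04_r`) and gaze fields; the column semantics below follow OpenFace's documented output format, but the intensity threshold and this run-length segmentation are illustrative assumptions, not the authors' actual annotation pipeline.

```python
# Illustrative sketch: count occurrences and durations of a social signal
# from OpenFace-style per-frame intensities. The threshold value and the
# segmentation logic are assumptions for demonstration, not the paper's
# actual method.

def annotate_signal(intensities, timestamps, threshold=1.0):
    """Return (start, end, duration) tuples for each contiguous run of
    frames where the intensity exceeds `threshold`."""
    events = []
    start = None
    for i, value in enumerate(intensities):
        if value > threshold and start is None:
            start = i  # signal onset
        elif value <= threshold and start is not None:
            # signal offset: close the run at the previous frame
            events.append((timestamps[start], timestamps[i - 1],
                           timestamps[i - 1] - timestamps[start]))
            start = None
    if start is not None:  # signal still active at the last frame
        events.append((timestamps[start], timestamps[-1],
                       timestamps[-1] - timestamps[start]))
    return events

# Synthetic 10-frame trace at 30 fps: one brow-lowerer (AU04) expression
# spanning frames 2-5.
ts = [i / 30.0 for i in range(10)]
au04_r = [0.0, 0.2, 1.5, 2.0, 1.8, 1.2, 0.3, 0.0, 0.0, 0.0]
events = annotate_signal(au04_r, ts, threshold=1.0)
print(len(events))             # → 1 occurrence
print(round(events[0][2], 3))  # → 0.1 (duration in seconds)
```

Per-condition statistics (occurrence counts and mean durations in error vs. error-free segments) can then be aggregated over such event lists for each signal type.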
References
Baltrusaitis, T., Zadeh, A., Lim, Y.C., Morency, L.P.: OpenFace 2.0: facial behavior analysis toolkit. In: Proceedings - 13th IEEE International Conference on Automatic Face and Gesture Recognition, FG 2018, pp. 59–66. Institute of Electrical and Electronics Engineers Inc. (2018)
Cahya, D., Giuliani, M.: Towards a cognitive architecture incorporating human feedback for interactive collaborative robots. Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), LNAI, vol. 10965, pp. 486–488 (2018)
Ekman, P., Friesen, W.V.: Facial Action Coding System - The Manual, vol. 160. Research Nexus division of Network Information Research Corporation (2002)
Giuliani, M., Mirnig, N., Stollnberger, G., Stadler, S., Buchner, R., Tscheligi, M.: Systematic analysis of video data from different human-robot interaction studies: a categorization of social signals during error situations. Front. Psychol. 6(July), 931 (2015)
Hagenaars, M.A., Oitzl, M., Roelofs, K.: Updating freeze: aligning animal and human research (2014). https://doi.org/10.1016/j.neubiorev.2014.07.021
Hamacher, A., Bianchi-Berthouze, N., Pipe, A.G., Eder, K.: Believing in BERT: using expressive communication to enhance trust and counteract operational error in physical human-robot interaction. In: 25th IEEE International Symposium on Robot and Human Interactive Communication, RO-MAN 2016, pp. 493–500. Institute of Electrical and Electronics Engineers Inc. (2016)
Mann, S.: Humanistic computing: “WearComp” as a new framework and application for intelligent signal processing. Proc. IEEE 86(11), 2123–2151 (1998). https://doi.org/10.1109/5.726784
Mirnig, N., Giuliani, M., Stollnberger, G., Stadler, S., Buchner, R., Tscheligi, M.: Impact of robot actions on social signals and reaction times in HRI error situations. In: Tapus, A., André, E., Martin, J.C., Ferland, F., Ammi, M. (eds.) Social Robotics. LNCS (LNAI), vol. 9388, pp. 461–471. Springer, Cham (2015). https://doi.org/10.1007/978-3-319-25554-5_46
Mirnig, N., Stollnberger, G., Miksch, M., Stadler, S., Giuliani, M., Tscheligi, M.: To err is robot: how humans assess and act toward an erroneous social robot. Front. Robot. AI 4(May), 1–15 (2017)
Rasmussen, J.: Human errors. A taxonomy for describing human malfunction in industrial installations. J. Occup. Accid. 4(2–4), 311–333 (1982)
Salem, M., Lakatos, M., Amirabdollahian, F., Dautenhahn, K.: Would you trust a (faulty) robot?: effects of error, task type and personality on human-robot cooperation and trust. In: Proceedings of the Tenth Annual ACM/IEEE International Conference on Human-Robot Interaction, pp. 141–148 (2015)
Trung, P., et al.: Head and shoulders: automatic error detection in human-robot interaction. In: Proceedings of the 19th ACM International Conference on Multimodal Interaction - ICMI 2017, pp. 181–188 (2017)
Acknowledgments
The first author acknowledges the scholarship support from the Ministry of Research and Higher Education (KEMENRISTEKDIKTI) of the Republic of Indonesia through the Research and Innovation in Science and Technology (RISET-Pro) Program (World Bank Loan No. 8245-ID).
© 2019 Springer Nature Switzerland AG
Cite this paper
Cahya, D.E., Ramakrishnan, R., Giuliani, M. (2019). Static and Temporal Differences in Social Signals Between Error-Free and Erroneous Situations in Human-Robot Collaboration. In: Salichs, M., et al. Social Robotics. ICSR 2019. Lecture Notes in Computer Science(), vol 11876. Springer, Cham. https://doi.org/10.1007/978-3-030-35888-4_18
Print ISBN: 978-3-030-35887-7
Online ISBN: 978-3-030-35888-4