
Investigating the Role of Multi-modal Social Cues in Human-Robot Collaboration in Industrial Settings

International Journal of Social Robotics

Abstract

Expressing social cues through different communication channels plays an important role in mutual understanding, in both human-human and human-robot collaboration. A few studies have investigated the effects of zoomorphic and anthropomorphic social cues expressed by industrial robot arms on robot-to-human communication. In this work, we investigate the role of multi-modal social cues by combining the robot’s head-like gestures with light and sound modalities in two studies. The first study found that multi-modal social cues have positive effects on people’s perception of the robot, perceived enjoyment, and intention to use. The second study found that combining human-like gestures with light and/or sound modalities could lead to higher understandability of the robot’s social cues. These findings support the use of multi-modal social cues for robots in industrial settings. However, possible negative impacts of implementing these social cues, e.g. overtrust and distraction, should also be considered.

Data Availability

Data are available from the corresponding author upon reasonable request.

Notes

  1. However, it is still less anthropomorphic than a robot whose arm(s) and head resemble the human body more closely.

  2. The term “head-like gesture” is shortened to “head gesture” in the rest of the paper.

  3. www.franka.de.

  4. Antropo—Research Platform Demonstration: https://certification.oshwa.org/be000008.html. Demonstration video: https://youtu.be/eAeAdPo-mJE.

  5. www.qualtrics.com.

  6. This condition means that all modalities were enabled. The conditions in which head gestures are combined with light feedback (Head+Light) or sound feedback (Head+Sound) are also multi-modal.


Funding

The work leading to these results has received funding from the European Union’s Horizon 2020 research and innovation program as part of the SOPHIA project under Grant Agreement No. 871237, and the Flemish Government under the program “Onderzoeksprogramma Artificiële Intelligentie (AI) Vlaanderen”.

Author information

Contributions

All authors contributed to the study conception and design. Material preparation, data collection, and analysis were performed by H-LC, CS, JW, and IEM. The first draft of the manuscript was written by H-LC and all authors commented on previous versions of the manuscript. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Hoang-Long Cao.

Ethics declarations

Conflicts of interest

The authors declare that they have no conflict of interest.

Ethical Approval

The authors declare that this research was conducted in full accordance with the Declaration of Helsinki, as well as the ethical guidelines provided by the organization.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Cao, HL., Scholz, C., De Winter, J. et al. Investigating the Role of Multi-modal Social Cues in Human-Robot Collaboration in Industrial Settings. Int J of Soc Robotics 15, 1169–1179 (2023). https://doi.org/10.1007/s12369-023-01018-9
