Abstract
Expressing social cues through different communication channels plays an important role in mutual understanding, in both human-human and human-robot collaboration. A few studies have investigated the effects of zoomorphic and anthropomorphic social cues expressed by industrial robot arms on robot-to-human communication. In this work, we investigate the role of multi-modal social cues by combining the robot's head-like gestures with light and sound modalities in two user studies. The first study found that multi-modal social cues have positive effects on people's perception of the robot, perceived enjoyment, and intention to use. The second study found that combining human-like gestures with light and/or sound modalities could make the robot's social cues more understandable. These findings support the use of multi-modal social cues for robots in industrial settings. However, possible negative impacts of implementing these social cues, such as overtrust and distraction, should also be considered.
Data Availability
Data are available from the corresponding author upon reasonable request.
Notes
Note: It is still less anthropomorphic than a robot with arm(s) and a head that more closely resemble the human body.
Note: The term “head-like gesture” is shortened to “head gestures” in the rest of the paper.
Antropo research platform demonstration: https://certification.oshwa.org/be000008.html. Demonstration video: https://youtu.be/eAeAdPo-mJE.
Note: In this condition, all modalities were enabled. The conditions in which head gestures are combined with light feedback (Head+Light) or sound feedback (Head+Sound) are also multi-modal.
Funding
The work leading to these results has received funding from the European Union’s Horizon 2020 research and innovation program as part of the SOPHIA project under Grant Agreement No. 871237, and the Flemish Government under the program “Onderzoeksprogramma Artificiële Intelligentie (AI) Vlaanderen”.
Author information
Authors and Affiliations
Contributions
All authors contributed to the study conception and design. Material preparation, data collection, and analysis were performed by H-LC, CS, JW, and IEM. The first draft of the manuscript was written by H-LC and all authors commented on previous versions of the manuscript. All authors read and approved the final manuscript.
Corresponding author
Ethics declarations
Conflicts of interest
The authors declare that they have no conflict of interest.
Ethical Approval
The authors declare that this research was conducted in full accordance with the Declaration of Helsinki, as well as the ethical guidelines provided by the organization.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Cao, HL., Scholz, C., De Winter, J. et al. Investigating the Role of Multi-modal Social Cues in Human-Robot Collaboration in Industrial Settings. Int J of Soc Robotics 15, 1169–1179 (2023). https://doi.org/10.1007/s12369-023-01018-9