Investigating People’s Rapport Building and Hindering Behaviors When Working with a Collaborative Robot

International Journal of Social Robotics

Abstract

Modern industrial robots are increasingly moving toward collaborating with people on complex tasks as team members, and away from working in isolated cages separated from people. Collaborative robots are programmed to use social communication techniques such as speech, gestures, or gaze, enabling human team members to draw on their existing interpersonal skills when working with robots. Research is increasingly investigating how robots can use higher-level social structures such as team dynamics or conflict resolution. One particularly important aspect of human–human teamwork is rapport building: the everyday social interactions between people that help to develop professional relationships by establishing trust, confidence, and collegiality, but which are formally peripheral to the task at hand. In this paper, we report on our investigations of whether and how people apply similar rapport-building behaviors to robot collaborators. First, we synthesized existing human–human rapport knowledge into an initial human–robot interaction framework; this framework includes verbal and non-verbal behaviors, both rapport building and rapport hindering, that people can be expected to exhibit. Second, we developed a novel mock industrial task scenario that emphasizes ecological validity and creates the range of social interactions necessary for investigating rapport. Finally, we report on a qualitative study of how people use rapport-building or rapport-hindering behaviors in our industrial scenario, reflecting how people may interact with robots in industrial settings.

References

  1. Ädel A (2011) Rapport building in student group work. J Pragmat 43(12):2932–2947

  2. Argyle M (1990) The biological basis of rapport. Psychol Inq 1:297–300

  3. Basow SA, Rubenfeld K (2003) Troubles talk: effects of gender and gender-typing. Sex Roles 48(3–4):183–187

  4. Bernieri FJ, Gillis JS, David JM, Grahe JE (1997) Dyad rapport and the accuracy of its judgment across situations: a lens model analysis. J Pers Soc Psychol 71(1):110–129

  5. Bickmore TW, Picard RW (2005) Establishing and maintaining long-term human–computer relationships. ACM Trans Comput Hum Interact 12(2):293–327

  6. Bohus D, Horvitz E (2010) Facilitating multiparty dialog with gaze, gesture, and speech. In: International conference on multimodal interfaces and the workshop on machine learning for multimodal interaction on—ICMI-MLMI ’10, p 1

  7. Bronstein L, Nelson N, Livnat Z, Ben-Ari R (2012) Rapport in negotiation: the contribution of the verbal channel. J Confl Resolut 56(6):1089–1115

  8. Cakmak M, Thomaz AL (2012) Designing robot learners that ask good questions. In: Proceedings of the ACM/IEEE international conference on human–robot interaction, HRI ’12. ACM, p 17

  9. Chao C, Thomaz A (2012) Timing in multimodal turn-taking interactions: control and analysis using timed Petri nets. J Hum Robot Interact 1(1):4–25

  10. Driskell T, Blickensderfer EL, Salas E (2012) Is three a crowd? Examining rapport in investigative interviews. Group Dyn Theory Res Pract 17(1):1–13

  11. Eagly AH (2009) The his and hers of prosocial behavior: an examination of the social psychology of gender. Am Psychol 34(8):644–658

  12. Eyssel F, Hegel F (2012) (S)he’s got the look: gender stereotyping of robots. J Appl Soc Psychol 42(9):2213–2230

  13. Gratch J, Okhmatovskaia A, Lamothe F et al (2006) Virtual rapport. In: Intelligent virtual agents. Springer, Berlin, pp 14–27

  14. Gratch J, Wang N, Gerten J, Fast E, Duffy R (2007) Creating rapport with virtual agents. In: Proceedings of the international conference on intelligent virtual agents, IVA ’07. Springer, Berlin, pp 125–138

  15. Gremler DD, Gwinner KP (2008) Rapport-building behaviors used by retail employees. J Retail 84(3):308–324

  16. Haddadi A, Croft EA, Gleeson BT, MacLean K, Alcazar J (2013) Analysis of task-based gestures in human–robot interaction. In: IEEE international conference on robotics and automation. IEEE, pp 2146–2152

  17. Haferd T (2013) Do I want to work with you in the future? Does status moderate the process by outcome interaction in ongoing workplace relationships? Columbia University

  18. Hawkins KP, Bansal S, Vo NN, Bobick AF (2014) Anticipating human actions for collaboration in the presence of task and sensor uncertainty. In: IEEE international conference on robotics and automation, ICRA ’14, pp 2215–2222

  19. Hayashi K, Sakamoto D, Kanda T et al (2007) Humanoid robots as a passive-social medium. In: Proceedings of the ACM/IEEE international conference on human–robot interaction—HRI ’07. ACM Press, p 137

  20. Hoffman G, Breazeal C (2004) Collaboration in human–robot teams. In: Proceedings of the AIAA intelligent systems technical conference, pp 1–18

  21. Huang C, Mutlu B (2013) Modeling and evaluating narrative gestures for humanlike robots. In: Proceedings of robotics: science and systems, RSS ’13, pp 26–32

  22. Huang L, Morency L, Gratch J (2011) Virtual rapport 2.0. In: Proceedings of ACM international conference on virtual agents. Springer, pp 68–79

  23. Jung MF, Martelaro N, Hinds PJ (2015) Using robots to moderate team conflict. In: Proceedings of ACM/IEEE international conference on human–robot interaction—HRI ’15. ACM, pp 229–236

  24. Kanda T, Shiomi M, Miyashita Z, Ishiguro H, Hagita N (2009) An affective guide robot in a shopping mall. In: Proceedings of the 4th ACM/IEEE international conference on human–robot interaction—HRI ’09. ACM Press, p 173

  25. Kato Y, Kanda T, Ishiguro H (2015) May I help you? Design of human-like polite approaching behavior. In: Proceedings of ACM/IEEE international conference on human–robot interaction—HRI ’15. ACM, pp 35–42

  26. Kay R (2006) Addressing gender differences in computer ability attitudes and use: the laptop effect. J Educ Comput Res 34:187–211

  27. Lee M, Forlizzi J, Kiesler S (2012) Personalization in HRI: a longitudinal field experiment. In: Proceedings of the ACM/IEEE international conference on human–robot interaction, HRI ’12, pp 319–326

  28. Melder WA, Truong KP, Uyl MD, Van Leeuwen DA, Neerincx MA, Loos LR, Plum BS (2007) Affective multimodal mirror: sensing and eliciting laughter. In: Proceedings of the international workshop on human-centered multimedia. HCM ’07. ACM, Augsburg, pp 31–40. https://doi.acm.org/10.1145/1290128.1290134

  29. Moon AJ, Parker CAC, Croft EA, Van der Loos HFM (2013) Design and impact of hesitation gestures during human–robot resource conflicts. J Hum Robot Interact 2(3):18–40

  30. Morris M, Venkatesh V, Ackerman P (2006) Gender and age differences in employee decisions about new technology: an extension to the theory of planned behavior. IEEE Trans Eng Manag 51(1):69–84

  31. Morrison RL (2009) Are women tending and befriending in the workplace? Gender differences in the relationship between workplace friendships and organizational outcomes. Sex Roles 60(1–2):1–13

  32. Mulac A, Bradac JJ, Gibbons P (2001) Empirical support for the gender-as-culture hypothesis: an intercultural analysis of male/female language differences. Hum Commun Res 27(1):121–152

  33. Mutlu B, Shiwa T, Kanda T, Ishiguro H, Hagita N (2009) Footing in human–robot conversations. In: Proceedings of ACM/IEEE international conference on human–robot interaction—HRI ’09. ACM, pp 61–69

  34. Niculescu A, Dijk B, Nijholt A, Li H, See SL (2013) Making social robots more attractive: the effects of voice pitch, humor and empathy. Int J Soc Robot 5(2):171–191

  35. Niculescu A, Van Dijk B, Nijholt A, See SL (2011) The influence of voice pitch on the evaluation of a social robot receptionist. In: Proceedings—2011 international conference on user science and engineering, i-USEr 2011, pp 18–23

  36. Nomura T (2014) Differences of expectation of rapport with robots dependent on situations. In: Proceedings of the ACM/IEEE international conference on human–robot interaction, HRI ’14, pp 383–389

  37. Nomura T, Kanda T (2013) Measurement of rapport-expectation with a robot. In: Proceedings of the ACM/IEEE international conference on human–robot interaction, HRI ’13, pp 201–202

  38. Rea DJ, Wang Y, Young JE (2015) Check your stereotypes at the door: an analysis of gender typecasts in social human–robot interaction. In: Proceedings of international conference on social robotics, ICSR ’15. Springer

  39. Reysen S (2005) Construction of a new scale: the Reysen likability scale. Soc Behav Personal 33(2):201–208

  40. Sakamoto D, Ono T (2006) Sociality of robots: do robots construct or collapse human relations? In: Proceedings of ACM/IEEE international conference on human–robot interaction—HRI ’06. ACM, pp 355–356

  41. Seo SH, Geiskkovitch D, Nakane M, King C, Young JE (2015) Poor thing! would you feel sorry for a simulated robot? In: Proceedings of international conference on human–robot interaction—HRI ’15. ACM, pp 125–132

  42. Seo SH, Gu J, Jeong S et al (2015) Women and men collaborating with robots on assembly lines: designing a novel evaluation scenario for collocated human–robot teamwork. In: Proceedings of ACM international conference on human–agent interaction 2015, HAI ’15. ACM

  43. Shah J, Wiken J, Williams B, Breazeal C (2011) Improved human–robot team performance using Chaski, a human-inspired plan execution system. In: Proceedings of the international conference on human–robot interaction, HRI ’11, pp 29–36

  44. Shibata T, Kawaguchi Y, Wada K (2011) Investigation on people living with seal robot at home. Int J Soc Robot 4(1):53–63

  45. Short E, Hart J, Vu M, Scassellati B (2010) No fair!! an interaction with a cheating robot. In: Proceedings of the ACM/IEEE international conference on human–robot interaction 2010, HRI ’10. IEEE, pp 219–226

  46. Strabala KW, Lee MK, Dragan AD, Forlizzi JL, Srinivasa S, Cakmak M, Micelli V (2013) Towards seamless human–robot handovers. J Hum Robot Interact 2(1):112–132

  47. Sung J, Guo L, Grinter RE, Christensen HI (2007) “My Roomba is Rambo”: intimate home appliances. In: UbiComp 2007 ubiquitous computing. Springer, pp 145–162

  48. Tickle-Degnen L, Rosenthal R (1990) The nature of rapport and its nonverbal correlates. Psychol Inq 1(4):285–293

  49. Venkatesh V, Morris MG, Davis GB, Davis FD (2003) User acceptance of information technology: a unified view. MIS Q 27(3):425–478

  50. Wang Y, Young JE (2014) Beyond “pink” and “blue”: gendered attitudes towards robots in society. In: Gender and IT appropriation. Science and practice on dialogue-forum for interdisciplinary exchange. European Society for Socially Embedded Technologies, pp 49–59

  51. Young JE, Sung J, Voida A et al (2010) Evaluating human–robot interaction. Int J Soc Robot 3(1):53–67

Author information

Corresponding author

Correspondence to Stela H. Seo.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Authors retain copyright and grant the International Journal of Social Robotics right of first publication with the work simultaneously licensed under a Creative Commons Attribution License that allows others to share the work with an acknowledgement of the work’s authorship and initial publication in this journal.

Appendix: Abbreviated Rapport Coding Guide

Verbal rapport building: a way of establishing connections and negotiating relationships. Rapport-building language is a type of interactive language whose primary purpose is to increase the social glue between the people communicating [1].

Baseline for judging verbal rapport building: the same standards as rapport building in human–human interaction. When interacting with the robot, participants verbally engage in a manner that aims to increase cohesion and build a relationship [1].

Positive (rapport building) standard: participants engage in rapport-building talk during the interaction with the robot.

Negative (rapport hindering) standard: participants do not engage in rapport-building talk during the interaction with the robot.

Example codes for short instances: rapport building (VR#) and rapport hindering (NVR#).

VR1: complimenting the robot [1]

VR2: thanking the robot [1], e.g., in response to praise or at the end of the task

VR3: asking the robot questions [2]: during the task or during the break, actively asking the robot questions not directly related to the task, e.g., questions about the robot’s “personal” information

VR4: responding to questions: responding to the robot’s questions in full sentences, actively disclosing personal information [2]

VR5: promoting the in-group: speech that references both the participant and the robot, such as use of the pronouns “we” and “let’s” [10] or use of the robot’s name [2]

VR6: mitigating response to criticism [1]: genuinely apologizing in response to the robot’s criticism

VR7: empathetic speech: responding to the robot’s complaints and concerns with agreement [1, 10] and empathy, e.g., during the paycheck complaint

VR8: disclosures: the participant discloses personal information unprompted by the robot, i.e., not in response to a question

NVR1: ignoring the robot’s politeness: the robot thanks the participant (e.g., after cleaning, during praise) and the participant gives no response

NVR2: ignoring the robot’s criticism: the robot tells the participant they are going very slowly, and the participant does not respond or responds insincerely

NVR3: limited responses to questions: during the break, the participant’s responses to questions are noticeably brief (e.g., only responding yes/no) and do not disclose additional personal information

NVR4: sarcasm: the participant responds to the robot using a sarcastic or insincere tone

Non-verbal rapport building: a way of establishing connections and negotiating relationships. Non-verbal rapport-building behaviors are those whose primary purpose is to increase the social glue between the people communicating.

Baseline for judging non-verbal rapport building: the same standards as rapport-building behavior in human–human interaction. When interacting with the robot, participants display behaviors that serve to increase rapport.

Positive (rapport building) standard: participants engage in rapport-building behaviors during the interaction with the robot.

Negative (rapport hindering) standard: participants display behaviors that reflect discomfort, distance, and/or disinterest in establishing rapport with the robot.

Example codes for short instances: rapport building (BR#) and rapport hindering (NBR#).

BR1: open posture during the break: the participant displays an open posture (e.g., leaning towards the robot, uncrossed arms, direct body orientation) [48]

BR2: facial expression: smiling at the robot (e.g., when the robot is speaking), making eye contact while the robot is speaking [48]

BR3: active behavioral engagement: e.g., laughing, nodding, waving [48]

BR4: physical proximity: the participant sits close to the robot during the break, i.e., right up against the table

NBR1: closed posture: the participant displays a closed posture (e.g., crossed arms, leaning away from the robot, facing away from the robot)

NBR2: distracted behavior: e.g., looking around the room, checking a phone

NBR3: facial expression: e.g., looking down or away from the robot when it is speaking, a neutral facial expression when talking with the robot

NBR4: physical distance: the participant sits far away from the robot

NBR5: testing: the participant tries to trick or test the robot, e.g., showing the same cloth twice

Coding notes:

  1. BR1/NBR1: code open or closed posture once during the break; if the participant changes posture, code it again
  2. BR2: code eye contact only when Nao is also focused on the participant
  3. If the participant does not respond verbally to thanks, compliments, etc., but responds non-verbally (e.g., smiling), code the non-verbal behavior
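
To make the structure of the coding guide concrete, the sketch below (not part of the original study materials) represents the abbreviated code sets as plain data structures and tallies rapport-building versus rapport-hindering instances for a single annotated session. The code IDs and descriptions follow the appendix; the Annotation class, the tally_codes helper, and the example timestamps are illustrative assumptions only.

from collections import Counter
from dataclasses import dataclass

# Illustrative sketch (not from the paper): the abbreviated rapport coding
# guide as simple dictionaries, plus a hypothetical helper that tallies
# how often building vs. hindering codes were applied in one session.

# Verbal rapport building (VR#) and hindering (NVR#), abbreviated.
VERBAL_CODES = {
    "VR1": "complimenting the robot",
    "VR2": "thanking the robot",
    "VR3": "asking the robot questions",
    "VR4": "responding to questions in full sentences",
    "VR5": "promoting the in-group ('we', 'let's', robot's name)",
    "VR6": "mitigating response to criticism",
    "VR7": "empathetic speech",
    "VR8": "unprompted personal disclosures",
    "NVR1": "ignoring the robot's politeness",
    "NVR2": "ignoring the robot's criticism",
    "NVR3": "limited responses to questions",
    "NVR4": "sarcasm or insincere tone",
}

# Non-verbal rapport building (BR#) and hindering (NBR#), abbreviated.
NONVERBAL_CODES = {
    "BR1": "open posture during the break",
    "BR2": "smiling / eye contact while the robot speaks",
    "BR3": "active behavioral engagement (laughing, nodding, waving)",
    "BR4": "physical proximity to the robot",
    "NBR1": "closed posture",
    "NBR2": "distracted behavior",
    "NBR3": "looking down or away, neutral facial expression",
    "NBR4": "physical distance from the robot",
    "NBR5": "testing or tricking the robot",
}

ALL_CODES = {**VERBAL_CODES, **NONVERBAL_CODES}


@dataclass
class Annotation:
    """One coded instance in a participant's session video (hypothetical)."""
    code: str         # e.g. "VR2"
    timestamp: float  # seconds into the session


def tally_codes(annotations: list[Annotation]) -> Counter:
    """Count rapport-building vs. rapport-hindering instances."""
    counts = Counter()
    for a in annotations:
        if a.code not in ALL_CODES:
            raise ValueError(f"unknown code: {a.code}")
        # Codes prefixed with 'N' (NVR#, NBR#) are hindering; VR#/BR# are building.
        counts["hindering" if a.code.startswith("N") else "building"] += 1
    return counts


if __name__ == "__main__":
    session = [Annotation("VR2", 95.0), Annotation("NVR3", 310.5),
               Annotation("BR2", 412.0)]
    print(tally_codes(session))  # Counter({'building': 2, 'hindering': 1})

In this sketch the building/hindering split is derived purely from the code prefix, which mirrors how the guide pairs each VR/BR family with an NVR/NBR counterpart; any per-participant analysis beyond simple counts would follow the qualitative procedure described in the paper rather than this toy tally.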

Cite this article

Seo, S.H., Griffin, K., Young, J.E. et al. Investigating People’s Rapport Building and Hindering Behaviors When Working with a Collaborative Robot. Int J of Soc Robotics 10, 147–161 (2018). https://doi.org/10.1007/s12369-017-0441-8
