Impacts of Visual Occlusion and Its Resolution in Robot-Mediated Social Collaborations

Published in: International Journal of Social Robotics

Abstract

In this work, we contribute to the current understanding of human behaviors in telepresence when visual occlusions are introduced in a remote collaboration context. Occlusions can occur when users in remote locations are engaged in physical collaborative tasks, and can lead to frustration and inefficient collaboration between the collaborators. We aim to design a better user interface to improve the remote collaboration experience. We conducted two human-subjects experiments to investigate the following interlinked research questions: (a) what are the impacts of occlusion on remote collaborations, and (b) can autonomous handling of occlusions improve the telepresence collaboration experience for remote users? Results from our preliminary experiment demonstrate that occlusions introduce a significant social interference that requires collaborators to reorient or reposition themselves. Subsequently, we conducted a main experiment to evaluate the efficacy of autonomous occlusion handling for remote users. Results from this experiment indicate that the use of an autonomous controller yields a remote user experience that is more comparable (in terms of vocal non-verbal behaviors ["The vocal non-verbal behaviour includes all spoken cues that surround the verbal message and influence its actual meaning." (Vinciarelli in Image Vis Comput 27(12):1743–1759, 2009)], task performance and perceived workload) to collaborations performed by two co-located parties. Finally, we discuss the implications of a better controller design for similar robot-mediated social interactions.

Figures 1–9 are available in the full-text article.

Notes

  1. Motor behavior is the study of how people learn, control and develop their motor skills through physical activity.

  2. Sense of presence measures one’s perception of others and the environment when experienced through a communication medium. We considered three aspects of presence (spatial presence, engagement and social presence) in this work based on the designed task. More detail is provided in Sect. 4.4.

  3. Skype™: "Skype™ is a trade mark of Skype and our research is not affiliated, sponsored, authorised or otherwise associated by/with the Skype group of companies."

  4. Xbox®, “Xbox is either a registered trademark or trademark of Microsoft Corporation in the United States and/or other countries.”

  5. WAM™ arm, Barrett Technology, LLC, "robotic arm for industrial purposes."

  6. Leap Motion™ is a trademark of Leap Motion, Inc.

  7. Available at http://wiki.ros.org/openni_tracker.

  8. AR-Tag is a fiducial marker system to support augmented reality [38]. The AR-Tag tracking library is available at: http://wiki.ros.org/ar_track_alvar.

References

  1. Kristoffersson A, Coradeschi S, Loutfi A (2013) A review of mobile robotic telepresence. Adv Hum Comput Interact 2013:1–17

  2. Tsui KM, Desai M, Yanco HA (2011) Exploring use cases for telepresence robots. In: Proceedings of the 6th international conference on Human–robot interaction, pp 11–18

  3. Giannoulis G, Coliou M, Kamarinos G, Roussou M, Trahanias P, Argyros A, Tsakiris D, Cremers A, Haehnel D, Burgard W, Savvaides V, Friess P, Konstantios D, Katselaki A (2001) Enhancing museum visitor access through robotic avatars connected to the web. In: Proceedings museums and the web, pp 37–46

  4. InTouch Health (2013) InTouch health receives FDA clearance for the RP-VITA remote presence robot, press release [Online]. http://www.intouchhealth.com/in-the-news/press-releases/01-08-2013/. Accessed 07 Oct 2015

  5. Lee MK, Takayama L (2011) Now, i have a body: uses and social norms for mobile remote presence in the workplace. In: Proceedings of the SIGCHI conference on human factors in computing systems, pp 33–42

  6. Schmidt RC, Richardson MJ (2008) Dynamics of interpersonal coordination. Understanding complex systems. Springer, Berlin, pp 281–308

  7. Breazeal C, Gray J, Berlin M (2009) An embodied cognition approach to mindreading skills for socially intelligent robots. Int J Robot Res 28(5):656–680

  8. Mead R (2012) Space, speech, and gesture in human–robot interaction. In: Proceedings of the 14th ACM international conference on multimodal interaction-ICMI’12, p 333

  9. van Dijk B, Zwiers J, op den Akker R, Kulyk O, Hondorp H, Hofs D, Nijholt A (2011) Conveying directional gaze cues to support remote participation in hybrid meetings. In: Esposito A, Esposito AM, Martone R, Müller VC, Scarpetta G (eds) Proceedings of the Third COST 2102 international training school conference on toward autonomous, adaptive, and context-aware multimodal interfaces: theoretical and practical issues, pp. 412–428

  10. Bondareva Y, Bouwhuis DG (2004) Determinants of social presence in videoconferencing. In: Proceedings of the workshop on environments for personalized information access, pp 1–9

  11. Cosgun A, Florencio DA, Christensen HI (2013) Autonomous person following for telepresence robots. In: Proceedings of the IEEE international conference on robotics and automation, pp 4335–4342

  12. Radmard S, Croft EA (2013) Overcoming occlusions in semi-autonomous telepresence systems. In: 16th international conference on advanced robotics, pp 1–6

  13. Pang WC, Seet G, Yao X (2014) A study on high-level autonomous navigational behaviors for telepresence applications. Presence Teleoper Virtual Environ 23(2):155–171

  14. Nakanishi H, Murakami Y, Nogami D, Ishiguro H (2008) Minimum movement matters: impact of robot-mounted cameras on social telepresence. In: Computer-supported cooperative work and social computing, pp 303–312

  15. Hall ET (1966) The hidden dimension: man’s use of space in public and private. Bodley Head, London

  16. Kendon A (1990) Spatial organization in social encounters: the F-formation system. In: Conducting interaction: patterns of behavior in focused encounters. Cambridge University Press, New York

  17. Huettenrauch H, Eklundh KS, Green A, Topp EA (2006) Investigating spatial relationships in human–robot interaction. In: IEEE international conference on intelligent robots and systems, pp 5052–5059

  18. Kuzuoka H, Suzuki Y, Yamashita J, Yamazaki K (2010) Reconfiguring spatial formation arrangement by robot body orientation. In: Proceedings of the 5th ACM/IEEE international conference on human–robot interaction, pp 285–292

  19. Mead R, Atrash A, Matarić MJ (2013) Automated proxemic feature extraction and behavior recognition: applications in human–robot interaction. Int J Soc Robot 5(3):367–378

  20. Rae I, Takayama L, Mutlu B (2013) The influence of height in robot-mediated communication. In: Proceedings of the 8th ACM/IEEE international conference on human–robot interaction, pp 1–8

  21. Yan R, Tee KP, Chua Y, Huang Z (2013) A user study for an attention-directed robot for telepresence. In: 11th international conference on smart homes and health telematics, pp 110–117

  22. Sirkin D, Venolia G, Tang J, Robertson G, Kim T, Inkpen K, Sedlins M, Lee B, Sinclair M (2011) Motion and attention in a kinetic videoconferencing proxy. In: Lecture notes in computer science, vol 6946. Springer, pp 162–180

  23. Adalgeirsson SO, Breazeal C (2010) MeBot a robotic platform for socially embodied telepresence. In: Proceedings of 5th ACM/IEEE international conference on human–robot interaction, no 2007, pp 15–22

  24. Cabibihan JJ, So WC, Saj S, Zhang Z (2012) Telerobotic pointing gestures shape human spatial cognition. Int J Soc Robot 4(3):263–272

  25. Kristoffersson A, Coradeschi S, Loutfi A, Severinson-Eklundh K (2014) Assessment of interaction quality in mobile robotic telepresence: an elderly perspective. Interact Stud 15(2):343–357

  26. Rae I, Mutlu B, Takayama L (2014) Bodies in motion: mobility, presence, and task awareness in telepresence. In: Proceedings of the 32nd annual ACM conference on human factors in computing systems, pp 2153–2162

  27. Kristoffersson A, Severinson Eklundh K, Loutfi A (2012) Measuring the quality of interaction in mobile robotic telepresence: a pilot’s perspective. Int J Soc Robot 5(1):89–101

  28. Radmard S, Moon A, Croft EA (2015) Interface design and usability analysis for a robotic telepresence platform. In: IEEE RO-MAN: the 24th IEEE international symposium on robot and human interactive communication, pp 511–516

  29. Cohen B, Lanir J, Stone R, Gurevich P (2011) Requirements and design considerations for a fully immersive robotic telepresence system. In: The 6th ACM/IEEE conference on human–robot interaction, social robotic telepresence workshop, pp 16–22

  30. Hernoux F, Béarée R, Gajny L, Nyiri E, Bancalin J, Gibaru O (2013) Leap Motion pour la capture de mouvement 3D par spline L1—Application à la robotique [Leap Motion for 3D motion capture by L1 splines—application to robotics]. In: Conférence Groupe de Travail en Modélisation Géométrique, pp 1–6

  31. Bassily D, Georgoulas C, Güttler J, Linner T, Bock T (2014) Intuitive and adaptive robotic arm manipulation using the Leap Motion controller. In: ISR Robotik, pp 78–84

  32. Guo C, Sharlin E (2008) Exploring the use of tangible user interfaces for human–robot interaction: a comparative study. In: Proceedings of the SIGCHI conference on human factors in computing systems, pp 121–130

  33. Hart SG, Staveland LE (1988) Development of NASA-TLX (Task Load Index): results of empirical and theoretical research. In: Hancock PA, Meshkati N (eds) Human mental workload, vol 52. North-Holland, pp 139–183

  34. Brooke J (1996) SUS—a quick and dirty usability scale. In: Jordan PW, Thomas B, Weerdmeester BA, McClelland AL (eds) Usability evaluation in industry. Taylor and Francis, London, pp 189–194

  35. Du G, Zhang P, Mai J, Li Z (2012) Markerless Kinect-based hand tracking for robot teleoperation. Int J Adv Robot Syst 9(36):1–10

  36. Obaid M, Kistler F, Häring M, Bühling R, André E (2014) A framework for user-defined body gestures to control a humanoid robot. Int J Soc Robot 6(3):383–396

  37. Radmard S, Meger D, Croft EA, Little JJ (2013) Overcoming unknown occlusions in eye-in-hand visual search. In: Proceedings of the IEEE international conference on robotics and automation, pp 3075–3082

  38. ARTag, Wikipedia, the free Encyclopedia [Online]. https://en.wikipedia.org/wiki/ARTag. Accessed 07 Oct 2015

  39. Radmard S, Croft EA (2015) Active target search for high dimensional robotic systems. Auton Robots 41:163–180

  40. Kristoffersson A, Coradeschi S, Eklundh KS, Loutfi A (2013) Towards measuring quality of interaction in mobile robotic telepresence using sociometric badges. Paladyn J Behav Robot 4(1):34–48

  41. Vinciarelli A, Pantic M, Bourlard H (2009) Social signal processing: survey of an emerging domain. Image Vis Comput 27(12):1743–1759

  42. Lombard M, Ditton TB, Weinstein L (2009) Measuring presence: the Temple Presence Inventory. In: 12th annual international workshop on presence

  43. Walker AM, Miller DP, Ling C (2013) Spatial orientation aware smartphones for tele-operated robot control in military environments: a usability experiment. Proc Hum Factors Ergon Soc Annu Meet 57(1):2027–2031

  44. Adams JA, Kaymaz-Keskinpala H (2004) Analysis of perceived workload when using a PDA for mobile robot teleoperation. In: Proceedings of the IEEE international conference on robotics and automation, pp 4128–4133

  45. Kiselev A, Loutfi A (2012) Using a mental workload index as a measure of usability of a user interface for social robotic telepresence. In: Ro-man workshop on social robotic telepresence, pp 3–6

  46. R Core Team (2015) R: a language and environment for statistical computing [Online]. http://www.r-project.org/. Accessed 07 Oct 2015

  47. Walker S, Bates D, Maechler M, Bolker B (2014) lme4: linear mixed-effects models using eigen and S4 [Online]. http://cran.r-project.org/package=lme4. Accessed 07 Oct 2015

  48. Kramer ADI, Oh LM, Fussell SR (2006) Using linguistic features to measure presence in computer-mediated communication. In: Proceedings of the SIGCHI conference on human factors in computing systems, pp 913–916

  49. Hung H, Gatica-Perez D (2010) Estimating cohesion in small groups using audio-visual nonverbal behavior. IEEE Trans Multimed 12(6):563–575

  50. Curhan JR, Pentland A (2007) Thin slices of negotiation: predicting outcomes from conversational dynamics within the first 5 minutes. J Appl Psychol 92(3):802–811

  51. Kim T, Chang A, Holland L, Pentland AS (2008) Meeting mediator: enhancing group collaboration using sociometric feedback. In: Proceedings of the 2008 ACM conference on computer supported cooperative work, pp 457–466

  52. Mead R, Matarić MJ (2014) Probabilistic models of proxemics for spatially situated communication in HRI. In: Proceedings of the 9th ACM/IEEE international conference on human–robot interaction, algorithmic human–robot interaction workshop, pp 3–9

Acknowledgements

We thank all of the participants in our study, and in particular Joost Hilte, Louisa Hardjasa, Alex Toews and Cole Shing, who supported us in the experiments. This research was funded by the Natural Sciences and Engineering Research Council of Canada, the Canada Foundation for Innovation and the UBC Institute for Computing, Information and Cognitive Systems.

Author information

Corresponding author

Correspondence to Sina Radmard.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Radmard, S., Moon, A. & Croft, E.A. Impacts of Visual Occlusion and Its Resolution in Robot-Mediated Social Collaborations. Int J of Soc Robotics 11, 105–121 (2019). https://doi.org/10.1007/s12369-018-0480-9
