DOI: 10.1145/3317959.3321492

Gaze awareness improves collaboration efficiency in a collaborative assembly task

Published: 25 June 2019

ABSTRACT

In building human-robot interaction systems, it would be helpful to understand how humans collaborate, and in particular, how humans use others' gaze behavior to estimate their intent. Here we studied the use of gaze in a collaborative assembly task, where a human user assembled an object with the assistance of a human helper. We found that being aware of the partner's gaze significantly improved collaboration efficiency. Task completion times were much shorter when gaze communication was available than when it was blocked. In addition, we found that the user's gaze was more likely to lie on the object of interest in the gaze-aware case than in the gaze-blocked case. In the context of human-robot collaboration systems, our results suggest that gaze data in the period surrounding verbal requests will be more informative and can be used to predict the target object.
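As a minimal illustrative sketch of the idea in the last sentence (not the authors' implementation; the data format, function name, and one-second window are assumptions), one could predict the requested object by counting which object the user's gaze falls on in a short window around the verbal request:

    # Python sketch: pick the object most fixated near the time of a verbal request.
    from collections import Counter

    def predict_target(gaze_samples, request_time, window_s=1.0):
        # gaze_samples: iterable of (timestamp_s, object_id) pairs, where object_id
        # is the object the gaze point falls on, or None if it falls on no object.
        hits = Counter(
            obj for t, obj in gaze_samples
            if obj is not None and abs(t - request_time) <= window_s
        )
        return hits.most_common(1)[0][0] if hits else None

A real system might weight fixations by duration or fuse them with speech recognition, but a simple count over the surrounding window already captures the intuition.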


    • Published in

      ETRA '19: Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications
      June 2019, 623 pages
      ISBN: 9781450367097
      DOI: 10.1145/3314111

      Copyright © 2019 ACM

      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Publication History

      • Published: 25 June 2019

      Qualifiers

      • short-paper

      Acceptance Rates

      Overall Acceptance Rate: 69 of 137 submissions, 50%

