DOI: 10.1145/3317959.3321492
ETRA conference proceedings · Short paper

Gaze awareness improves collaboration efficiency in a collaborative assembly task

Published: 25 June 2019

Abstract

In building human-robot interaction systems, it would be helpful to understand how humans collaborate, and in particular, how humans use others' gaze behavior to estimate their intent. Here we studied the use of gaze in a collaborative assembly task, where a human user assembled an object with the assistance of a human helper. We found that being aware of the partner's gaze significantly improved collaboration efficiency: task completion times were much shorter when gaze communication was available than when it was blocked. In addition, we found that the user's gaze was more likely to lie on the object of interest in the gaze-aware case than in the gaze-blocked case. In the context of human-robot collaboration systems, our results suggest that gaze data from the period surrounding verbal requests will be more informative and can be used to predict the target object.




    Published In

    ETRA '19: Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications
    June 2019
    623 pages
    ISBN:9781450367097
    DOI:10.1145/3314111
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].


    Publisher

Association for Computing Machinery, New York, NY, United States


    Author Tags

    1. 3D gaze estimation
    2. gaze awareness
    3. gaze tracking
    4. human robot collaboration

    Qualifiers

    • Short-paper

    Conference

    ETRA '19

    Acceptance Rates

    Overall Acceptance Rate 69 of 137 submissions, 50%



Cited By
    • (2024) Joint Attention on the Future: Pro-Ecological Attitudes Change in Collaboration. Proceedings of the 2024 Symposium on Eye Tracking Research and Applications, 1-3. DOI: 10.1145/3649902.3655100. Online publication date: 4-Jun-2024.
    • (2024) GazePrompt: Enhancing Low Vision People's Reading Experience with Gaze-Aware Augmentations. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-17. DOI: 10.1145/3613904.3642878. Online publication date: 11-May-2024.
    • (2024) Appearance-Based Gaze Estimation With Deep Learning: A Review and Benchmark. IEEE Transactions on Pattern Analysis and Machine Intelligence 46(12), 7509-7528. DOI: 10.1109/TPAMI.2024.3393571. Online publication date: Dec-2024.
    • (2024) Augmenting collaborative interaction with shared visualization of eye movement and gesture in VR. Computer Animation and Virtual Worlds 35(3). DOI: 10.1002/cav.2264. Online publication date: 4-Jun-2024.
    • (2023) Collaboration Assistance Through Object Based User Intent Detection Using Gaze Data. Proceedings of the 2023 Symposium on Eye Tracking Research and Applications, 1-2. DOI: 10.1145/3588015.3590130. Online publication date: 30-May-2023.
    • (2023) Supporting Complex Decision-Making: Evidence from an Eye Tracking Study on In-Person and Remote Collaboration. ACM Transactions on Computer-Human Interaction 30(5), 1-27. DOI: 10.1145/3581787. Online publication date: 23-Sep-2023.
    • (2023) MRMAC: Mixed Reality Multi-user Asymmetric Collaboration. 2023 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 591-600. DOI: 10.1109/ISMAR59233.2023.00074. Online publication date: 16-Oct-2023.
    • (2023) 3D Gaze Vis: Sharing Eye Tracking Data Visualization for Collaborative Work in VR Environment. Computer Supported Cooperative Work and Social Computing, 610-621. DOI: 10.1007/978-981-99-2385-4_46. Online publication date: 13-May-2023.
    • (2021) GazeEMD: Detecting Visual Intention in Gaze-Based Human-Robot Interaction. Robotics 10(2), 68. DOI: 10.3390/robotics10020068. Online publication date: 30-Apr-2021.
    • (2021) VR Collaborative Object Manipulation Based on Viewpoint Quality. 2021 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 60-68. DOI: 10.1109/ISMAR52148.2021.00020. Online publication date: Oct-2021.
