ABSTRACT
In building human-robot interaction systems, it would be helpful to understand how humans collaborate and, in particular, how they use others' gaze behavior to estimate intent. Here we studied the use of gaze in a collaborative assembly task, where a human user assembled an object with the assistance of a human helper. We found that being aware of the partner's gaze significantly improved collaboration efficiency: task completion times were significantly shorter when gaze communication was available than when it was blocked. In addition, we found that the user's gaze was more likely to lie on the object of interest in the gaze-aware condition than in the gaze-blocked condition. In the context of human-robot collaboration systems, our results suggest that gaze data from the period surrounding verbal requests is especially informative and can be used to predict the target object.
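As a minimal sketch of how this suggestion might be operationalized in a human-robot collaboration system, the Python snippet below accumulates gaze dwell time within a window around a verbal request and predicts the most-fixated object as the target. The fixation tuple format, the window width, and the function name are illustrative assumptions, not the paper's implementation.

```python
from collections import defaultdict

def predict_target(fixations, request_time, window=2.0):
    """Predict the requested object from gaze fixations near a verbal request.

    fixations: list of (start_s, end_s, object_id) tuples from an eye tracker,
        where object_id names the object the gaze lay on during that interval.
    request_time: onset of the verbal request, in seconds.
    window: half-width (s) of the interval around the request to consider.
    Returns the object_id with the most accumulated dwell time, or None.
    """
    lo, hi = request_time - window, request_time + window
    dwell = defaultdict(float)
    for start, end, obj in fixations:
        # Accumulate only the portion of each fixation inside the window.
        overlap = min(end, hi) - max(start, lo)
        if overlap > 0:
            dwell[obj] += overlap
    return max(dwell, key=dwell.get) if dwell else None

# Example: the user looks mostly at the "red_block" just before asking for it.
fixations = [(0.0, 1.2, "screwdriver"), (1.3, 3.1, "red_block"),
             (3.2, 3.6, "helper_face"), (3.7, 4.5, "red_block")]
print(predict_target(fixations, request_time=3.5))  # -> "red_block"
```

Dwell-time aggregation is only one plausible choice; a deployed system might instead weight fixations by recency relative to the request onset or combine gaze with the parsed verbal referent.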