ABSTRACT
In video-mediated remote communication, a Gaze Cue that visualizes the direction of a partner's gaze effectively supports topic sharing by facilitating mutual gaze between users. However, because the Gaze Cue is rendered within the user's field of view, it can feel visually obstructive. In addition, when users want to share a topic, they often express that intention verbally, for example by saying, "Look at that." In this study, assuming VR remote tourism, we conducted an experiment based on a school-tour experience and examined the relationship between the speed of joint attention and the type of Gaze Cue switched in response to voice.
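The voice-triggered switching described above can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the class name, cue types ("arrow", "spotlight"), and the deictic-phrase list are all assumptions introduced here. The cue stays hidden by default (so the view remains unobstructed) and is enabled or switched when the partner's utterance contains a deictic phrase.

```python
# Hypothetical sketch of a voice-triggered Gaze Cue switcher.
# Cue type names and deictic phrases are illustrative assumptions.

DEICTIC_PHRASES = ("look at that", "look at this", "over there")

class GazeCueSwitcher:
    def __init__(self, cue_types=("arrow", "spotlight")):
        self.cue_types = list(cue_types)
        self.active_cue = None  # no cue shown -> field of view stays unobstructed

    def on_utterance(self, text):
        """Enable the default cue type when a deictic phrase is heard."""
        if any(p in text.lower() for p in DEICTIC_PHRASES):
            self.active_cue = self.cue_types[0]
        return self.active_cue

    def cycle_cue(self):
        """Switch to the next cue type (e.g., arrow -> spotlight)."""
        if self.active_cue is None:
            self.active_cue = self.cue_types[0]
        else:
            i = self.cue_types.index(self.active_cue)
            self.active_cue = self.cue_types[(i + 1) % len(self.cue_types)]
        return self.active_cue
```

In this sketch, keeping the cue off until a deictic utterance arrives reflects the abstract's concern that an always-visible Gaze Cue obstructs the viewer's field of view.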
Index Terms
- Gaze Cue Switcher for Joint Attention in VR Remote Tourism