ABSTRACT
As eye-tracking technology becomes increasingly prevalent in augmented reality (AR), new opportunities arise for collaborative applications. In this paper, we propose a novel approach to improving collaborative interaction through object-based user intent detection using gaze data. Our system uses reinforcement learning (RL) to dynamically adapt the user interface to the context of the collaborative task. The system visualizes each user's intent in a shared environment, improving collaborative awareness between users. We evaluate our approach in a user study centered on visual search tasks. The results demonstrate that our system significantly improves task completion times and reduces cognitive load. Subjective feedback further suggests that users are more aware of each other's activity, highlighting the benefits of our approach. We encourage future user studies to assess the suitability of our approach for additional collaborative tasks.
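The abstract does not detail the detection algorithm itself. As a rough illustration only, object-based intent detection from gaze data is commonly built on per-object dwell-time accumulation: gaze samples are ray-cast onto scene objects, and an object whose accumulated dwell time crosses a threshold is flagged as the likely target. The sketch below is a hypothetical minimal version of that idea (the function name, sample format, and threshold are assumptions, not the authors' method, and the paper's RL-based interface adaptation is not modeled here):

```python
from collections import defaultdict

def detect_intended_object(gaze_samples, dwell_threshold_s=0.3):
    """Return the id of the first object whose accumulated gaze dwell
    time reaches dwell_threshold_s, or None if no object qualifies.

    gaze_samples: iterable of (timestamp_s, object_id) pairs, where
    object_id is the scene object hit by the gaze ray (None on a miss).
    """
    dwell = defaultdict(float)  # accumulated dwell time per object
    prev_t, prev_obj = None, None
    for t, obj in gaze_samples:
        # Credit the interval since the previous sample to the object
        # that was being looked at during that interval.
        if prev_t is not None and prev_obj is not None:
            dwell[prev_obj] += t - prev_t
            if dwell[prev_obj] >= dwell_threshold_s:
                return prev_obj
        prev_t, prev_obj = t, obj
    return None

# Example: 40 samples at 100 Hz, all landing on the same object.
samples = [(i * 0.01, "mug") for i in range(40)]
print(detect_intended_object(samples))  # → mug
```

In a full system, the detected object id would feed the intent visualization shared between collaborators; the RL component described in the abstract would then decide how and when to surface that intent in the interface.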