ABSTRACT
A major challenge in remote meetings is that awareness cues such as gaze become degraded, despite playing a crucial role in communication and in establishing joint attention. Eye tracking can overcome these obstacles by enabling the augmentation of remote meetings with gaze information. In this project, we followed a participatory approach, first distributing a scenario-based survey to students (n=79) to uncover their preferences for eye-based joint attention support (real-time, retrospective, real-time & retrospective, none) in remote university meetings. Building on these findings, we developed EyeMeet, an eye-based joint attention support system that combines state-of-the-art real-time joint attention support with retrospective attention feedback for remote meetings. In a four-week study, two student groups worked remotely on course assignments using EyeMeet. Our findings highlight that EyeMeet supports students in staying more focused during meetings. Complementing real-time joint attention support, retrospective joint attention feedback was recognized as providing valuable support for reflecting on and adapting behavior for upcoming meetings.