CHI Conference Proceedings · poster
DOI: 10.1145/3491101.3519792

EyeMeet: A Joint Attention Support System for Remote Meetings

Published: 28 April 2022

ABSTRACT

A major challenge in remote meetings is that awareness cues such as gaze become degraded, even though they play a crucial role in communication and in establishing joint attention. Eye tracking can help overcome these obstacles by enabling remote meetings to be augmented with gaze information. In this project, we followed a participatory approach: we first distributed a scenario-based survey to students (n=79) to uncover their preferences for eye-based joint attention support (real-time, retrospective, both, or none) in remote university meetings. Building on these findings, we developed EyeMeet, an eye-based joint attention support system that combines state-of-the-art real-time joint attention support with retrospective attention feedback for remote meetings. In a four-week study, two student groups worked remotely on course assignments using EyeMeet. Our findings highlight that EyeMeet supports students in staying more focused during meetings. Complementing real-time joint attention support, retrospective joint attention feedback was recognized as providing valuable support for reflecting on and adapting behavior ahead of upcoming meetings.


Supplemental Material

  • 3491101.3519792-talk-video.mp4 (mp4, 8.2 MB)
  • 3491101.3519792-video-preview.mp4 (mp4, 1.8 MB)


  • Published in

    CHI EA '22: Extended Abstracts of the 2022 CHI Conference on Human Factors in Computing Systems
    April 2022
    3066 pages
    ISBN:9781450391566
    DOI:10.1145/3491101

    Copyright © 2022 ACM

    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

    Publisher

    Association for Computing Machinery

    New York, NY, United States



    Qualifiers

    • poster
    • Research
    • Refereed limited

    Acceptance Rates

Overall acceptance rate: 6,164 of 23,696 submissions (26%)

