
A Preliminary Study on Eye Contact Framework Toward Improving Gaze Awareness in Video Conferences

  • Conference paper
Human-Computer Interaction (HCII 2023)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 14011)

Abstract

Gaze information plays an important role as non-verbal information in face-to-face conversations. In online videoconferences, however, users’ gaze appears misaligned because the screen and the camera occupy different positions. This misalignment causes a loss of gaze information, such as gaze awareness. To address this problem, gaze correction methods for videoconferencing have been extensively discussed; these methods allow participants to maintain eye contact with each other even in a videoconference. However, people rarely make constant eye contact in face-to-face conversations. Because a person’s gaze generally reflects their intentions, a system that corrects gaze unconditionally conveys the intention of the user’s gaze incorrectly. Therefore, we conducted a preliminary study toward an eye contact framework: a system that corrects the user’s gaze only when it detects that the user is looking at the face of a videoconferencing participant. In this study, participants used the system in an online conference and evaluated it qualitatively. The prototype did not yield a significant improvement in the evaluation of gaze awareness, but the questionnaire provided useful feedback. We will improve this prototype and aim to develop a framework that facilitates non-verbal communication in online videoconferences.
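The gating idea in the abstract — apply gaze correction only while the user's gaze point lies on another participant's face — can be sketched as a simple predicate. This is not the authors' implementation; it is a minimal sketch assuming normalized screen coordinates from an eye tracker and a normalized face bounding box as a detector such as MediaPipe Face Detection would report it. The names `FaceBox` and `should_correct_gaze` are hypothetical.

```python
from dataclasses import dataclass


@dataclass
class FaceBox:
    # Normalized face bounding box (0..1 screen coordinates),
    # in the style a face detector would report.
    xmin: float
    ymin: float
    width: float
    height: float


def should_correct_gaze(gaze_x: float, gaze_y: float,
                        face: FaceBox, margin: float = 0.05) -> bool:
    """Return True only when the estimated on-screen gaze point falls
    inside the (slightly expanded) face bounding box, i.e. when the
    user appears to be looking at the participant's face.

    The margin absorbs small eye-tracker noise; its value here is an
    illustrative assumption, not a reported parameter.
    """
    return (face.xmin - margin <= gaze_x <= face.xmin + face.width + margin
            and face.ymin - margin <= gaze_y <= face.ymin + face.height + margin)
```

In a full pipeline, this predicate would sit between the eye tracker and the gaze-redirection stage, enabling correction only while it returns True, so that deliberate gaze aversion is transmitted unmodified.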


Notes

  1. https://google.github.io/mediapipe/solutions/face_detection.html

  2. https://corexy.com/


Acknowledgement

This work was funded by JST CREST Grant Number JPMJCR19F2, Japan. We are grateful to Associate Prof. Hiromi Morita for lending us the Tobii Pro Nano for our study.

Author information

Correspondence to Kazuya Izumi.


Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Izumi, K. et al. (2023). A Preliminary Study on Eye Contact Framework Toward Improving Gaze Awareness in Video Conferences. In: Kurosu, M., Hashizume, A. (eds) Human-Computer Interaction. HCII 2023. Lecture Notes in Computer Science, vol 14011. Springer, Cham. https://doi.org/10.1007/978-3-031-35596-7_31


  • DOI: https://doi.org/10.1007/978-3-031-35596-7_31

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-35595-0

  • Online ISBN: 978-3-031-35596-7

  • eBook Packages: Computer Science, Computer Science (R0)
