Collaborative eye tracking based code review through real-time shared gaze visualization

  • Research Article
  • Published in Frontiers of Computer Science

Abstract

Code review is intended to find bugs in early development phases, improving code quality for later integration and testing. However, novice programmers, who often lack experience with algorithm design or software development, face challenges when reviewing code on their own. In this paper, we use collaborative eye tracking to record gaze data from multiple reviewers and share gaze visualizations among them during the code review process. The visualizations, such as borders highlighting the currently reviewed code lines and transition lines connecting related code lines, reveal each reviewer's visual attention on program functions, which facilitates program understanding and bug tracing. This helps novice reviewers confirm potential bugs and avoid reviewing the same code repeatedly, and may even help them improve their reviewing skills. We built a prototype system and conducted a user study with paired reviewers. The results showed that the shared real-time gaze visualization allowed the reviewers to find bugs more efficiently.
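
To make the mechanism behind these cues concrete, the sketch below shows one plausible way to turn a reviewer's gaze fixations into the two shared cues the abstract describes: a highlight border on the currently fixated code line and transition links between successively fixated lines. This is our own minimal sketch, not the authors' implementation; the names (Fixation, GazeLineMapper, SharedGazeState) and the parameter values (18 px line height, 100 ms minimum fixation) are illustrative assumptions rather than details from the paper.

from __future__ import annotations

from dataclasses import dataclass, field


@dataclass
class Fixation:
    """One gaze fixation in editor pixel coordinates, with duration in ms."""
    x: float
    y: float
    duration_ms: float


@dataclass
class SharedGazeState:
    """Per-reviewer state that would be broadcast to the other reviewer."""
    current_line: int | None = None                                   # line to draw a highlight border on
    transitions: list[tuple[int, int]] = field(default_factory=list)  # pairs of lines to connect


class GazeLineMapper:
    """Maps fixations to source-code line numbers and updates the shared state."""

    def __init__(self, line_height_px: float = 18.0, min_fixation_ms: float = 100.0):
        # Assumed editor line height and fixation threshold; real values would
        # come from the editor layout and the eye tracker's fixation filter.
        self.line_height_px = line_height_px
        self.min_fixation_ms = min_fixation_ms

    def fixation_to_line(self, fix: Fixation, scroll_offset_px: float = 0.0) -> int:
        """Convert a fixation's y coordinate to a 1-based code line number."""
        return int((fix.y + scroll_offset_px) // self.line_height_px) + 1

    def update(self, state: SharedGazeState, fix: Fixation,
               scroll_offset_px: float = 0.0) -> SharedGazeState:
        """Ignore very short fixations, highlight the fixated line, and record a
        transition whenever attention jumps to a different line."""
        if fix.duration_ms < self.min_fixation_ms:
            return state
        line = self.fixation_to_line(fix, scroll_offset_px)
        if state.current_line is not None and line != state.current_line:
            state.transitions.append((state.current_line, line))
        state.current_line = line
        return state


if __name__ == "__main__":
    mapper = GazeLineMapper()
    state = SharedGazeState()
    # Synthetic fixations: the reviewer reads line 3, jumps to line 12, returns to line 4.
    for fix in [Fixation(120, 40, 220), Fixation(150, 205, 180), Fixation(90, 60, 150)]:
        state = mapper.update(state, fix)
    print("highlight line:", state.current_line)   # border cue
    print("transition lines:", state.transitions)  # connecting-line cues

In a full system, each reviewer's SharedGazeState would be streamed to the other reviewer's editor, which renders the border and the transition lines as an overlay; filtering out very short fixations is one simple way to keep the shared overlay from flickering.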

Acknowledgements

We thank all the volunteers who took part in the user studies, and all the reviewers who provided helpful comments on previous versions of this paper. We also gratefully acknowledge grants from the National Natural Science Foundation of China (Grant Nos. 61772468, 62172368), the National Key Research & Development Program of China (2016YFB1001403), and the Fundamental Research Funds for the Provincial Universities of Zhejiang (RF-B2019001).

Author information

Corresponding author

Correspondence to Shiwei Cheng.

Additional information

Shiwei Cheng received the PhD degree in computer science and technology from Zhejiang University, China, in 2009. He then joined the School of Computer Science and Technology at Zhejiang University of Technology, China, where he became a Professor and Vice Director of the Software Engineering Institute. He has published more than 30 papers in international journals and conferences. He has served as a committee member of the SIGCHI China Chapter and the China Computer Federation (CCF) HCI Technical Committee. His research interests include human-computer interaction, ubiquitous computing, and collaborative computing.

Jialing Wang is a PhD student in the School of Computer Science and Technology at Zhejiang University of Technology, China. His research interest is human-computer interaction.

Xiaoquan Shen is a master's student in the School of Computer Science and Technology at Zhejiang University of Technology, China. His research interests include human-computer interaction and image processing.

Yijian Chen is a master's student in the School of Computer Science and Technology at Zhejiang University of Technology, China. His research interests include human-computer interaction and image processing.

Anind Dey is a Professor and Dean of the Information School at the University of Washington, USA, and an Adjunct Professor in the Department of Human-Centered Design and Engineering. In 2015, he was inducted into the ACM SIGCHI Academy for his significant contributions to the field of human-computer interaction. His research interests lie at the intersection of human-computer interaction, machine learning, and ubiquitous computing.

About this article

Cite this article

Cheng, S., Wang, J., Shen, X. et al. Collaborative eye tracking based code review through real-time shared gaze visualization. Front. Comput. Sci. 16, 163704 (2022). https://doi.org/10.1007/s11704-020-0422-1
