
Gaze Analysis System for Immersive 360° Video for Preservice Teacher Education

Published: 27 October 2023

Abstract

Unified systems for multi-sensor devices, particularly eye tracking in Virtual Reality (VR), are intricate and often require listening to and streaming multichannel data. In this project, we propose a visual analysis framework that replicates a participant's viewing experience by interpreting head movements as rotations and rendering the point of gaze (POG) as an on-screen indicator. Our solution adds a near-real-time system layer that processes and analyzes this multi-device data, enabling both near-real-time and subsequent offline review of an entire VR eye-tracking session. Moreover, our method provides an out-of-the-box way to produce traditional eye-tracking visualizations. Finally, we apply three analysis metrics from prior education-technology research, namely higher gaze density on students, shorter fixation times, and lower fixation-duration variance, to determine expertise levels in this system. We systematically build a ubiquitous, multi-device eye-tracking solution that incorporates this approach. We evaluate the system's effectiveness through a real-world user study with sixty-four participants at both expert and non-expert levels, surveying them to assess the quality of the replicated experience. We demonstrate the application's significance and its potential to integrate prior analysis metrics using the collected data; the data collection and analysis were approved by an IRB.
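The replay mechanism the abstract describes, head movements as rotations plus an on-screen POG marker, amounts to projecting a world-space gaze ray onto the equirectangular 360° frame. The sketch below illustrates one way to do that; it is not the authors' implementation, and the quaternion convention, "forward" axis, and function names are assumptions.

```python
import math
import numpy as np
from scipy.spatial.transform import Rotation

def gaze_to_equirect(head_quat_xyzw, gaze_dir_local, width, height):
    """Map an eye-in-head gaze direction to equirectangular pixel coordinates.

    head_quat_xyzw: head orientation quaternion (x, y, z, w), assumed convention.
    gaze_dir_local: gaze direction in the head's coordinate frame.
    width, height:  resolution of the equirectangular video frame.
    """
    # Rotate the eye-in-head gaze direction into world space using the head pose.
    world = Rotation.from_quat(head_quat_xyzw).apply(gaze_dir_local)
    x, y, z = world / np.linalg.norm(world)

    # Longitude/latitude of the gaze ray on the viewing sphere
    # (assumes a y-up, -z-forward world, as in OpenGL/WebGL).
    lon = math.atan2(x, -z)   # [-pi, pi]
    lat = math.asin(y)        # [-pi/2, pi/2]

    # Standard equirectangular projection: longitude -> u, latitude -> v.
    u = (lon / (2 * math.pi) + 0.5) * width
    v = (0.5 - lat / math.pi) * height
    return u, v  # pixel position for the on-screen POG marker
```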
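The "additional layer" for listening to and merging multichannel device data could look like the following library-agnostic sketch: one queue per device stream, samples timestamped on arrival, and a single merged, time-ordered feed usable for near-real-time replay or offline logging. Channel names and the sample format are illustrative, not from the paper.

```python
import heapq
import itertools
import queue
import threading
import time

_seq = itertools.count()  # tie-breaker so heap entries always compare

def listen(channel, source, out_q):
    # Tag each incoming sample with its arrival time and channel so head-pose
    # and gaze samples can be lined up downstream.
    for sample in source:
        out_q.put((time.monotonic(), next(_seq), channel, sample))

def merged(queues, stop):
    # Drain all per-device queues into one stream, ordered (approximately)
    # by arrival time; consumers can replay it live or persist it for
    # offline review of the whole session.
    buf = []
    while not stop.is_set() or buf:
        for q in queues:
            try:
                heapq.heappush(buf, q.get_nowait())
            except queue.Empty:
                pass
        if buf:
            yield heapq.heappop(buf)

# Illustrative wiring: one listener thread per device stream.
# pose_q, gaze_q, stop = queue.Queue(), queue.Queue(), threading.Event()
# threading.Thread(target=listen, args=("pose", pose_stream, pose_q)).start()
# threading.Thread(target=listen, args=("gaze", gaze_stream, gaze_q)).start()
# for t, _, channel, sample in merged([pose_q, gaze_q], stop): ...
```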
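The three expertise indicators could be computed from a labeled fixation log roughly as follows. This assumes fixations are already detected (e.g., by a dispersion-threshold detector) and tagged with whether they landed on a student area of interest; the field names are hypothetical.

```python
import statistics

def expertise_indicators(fixations):
    """fixations: list of dicts like
    {"duration_ms": 312.0, "on_student": True}."""
    durations = [f["duration_ms"] for f in fixations]
    on_student = [f for f in fixations if f["on_student"]]
    return {
        # Experts are expected to fixate on students more densely...
        "student_gaze_density": len(on_student) / len(fixations),
        # ...with shorter individual fixations...
        "mean_fixation_ms": statistics.fmean(durations),
        # ...and more uniform fixation durations (lower variance).
        "fixation_ms_variance": statistics.variance(durations),
    }
```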




    Information

    Published In

    MM '23: Proceedings of the 31st ACM International Conference on Multimedia
    October 2023
    9913 pages
    ISBN:9798400701085
    DOI:10.1145/3581783
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].


    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 27 October 2023
    DOI: 10.1145/3581783.3613908

    Author Tags

    1. education technology
    2. eye-tracking
    3. gaze analysis
    4. heat maps
    5. human-computer interaction
    6. virtual reality

    Qualifiers

    • Research-article

    Conference

    MM '23: The 31st ACM International Conference on Multimedia
    October 29 - November 3, 2023
    Ottawa ON, Canada

    Acceptance Rates

    Overall Acceptance Rate 2,145 of 8,556 submissions, 25%

    Bibliometrics & Citations

    Article Metrics

    • Downloads (last 12 months): 88
    • Downloads (last 6 weeks): 9
    Reflects downloads up to 05 Mar 2025

    Cited By

    • (2024) MNIST-Fraction: Enhancing Math Education with AI-Driven Fraction Detection and Analysis. Proceedings of the 2024 ACM Southeast Conference, 284-290. https://doi.org/10.1145/3603287.3651221. Online publication date: 18-Apr-2024.
    • (2024) Nail salon dust reveals alarmingly high photoinitiator levels: Assessing occupational risks. Journal of Hazardous Materials 475, 134913. https://doi.org/10.1016/j.jhazmat.2024.134913. Online publication date: Aug-2024.
