HMD Eye-Tracking Measurement of Miniature Eye Movement Toward VR Image Navigation

  • Conference paper
Universal Access in Human-Computer Interaction. User and Context Diversity (HCII 2022)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 13309)


Abstract

We conducted preliminary measurements of eye movement and head-mounted display (HMD) motion to collect primary data for a spatio-temporal virtual reality (VR) navigation system. Using the eye-tracking function of the Vive Pro Eye HMD, we measured eye movement and the rotational movement of the HMD while participants gazed at a VR marker. We compared gazing at a fixed point with tracking a marker in randomly bouncing linear motion, as well as in horizontal and vertical motion, by determining the Hurst exponent and the anisotropy of the gaze trajectories.

Trajectories for fixed-point vision and slow marker chasing showed a Hurst exponent less than 1/2, indicating anti-persistence. In contrast, as the marker velocity increased, the displacements of the gaze trajectories were stretched and showed persistence in the direction of marker motion. As the marker speed decreased, the gaze trajectory expanded perpendicular to the marker motion, suggesting that the anti-persistent miniature motion enhanced the collection of visual information. Users were also found to unconsciously superimpose a persistent HMD motion on the gaze motion in the horizontal direction. We infer that this tendency helps generate the miniature gaze motion used to collect visual information.



Acknowledgments

The authors would like to thank all the participants for their cooperation and helpful discussions. They would also like to thank Enago for English language review.


Corresponding author

Correspondence to Shu Matsuura.


Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Fujimoto, S., Iwase, M., Matsuura, S. (2022). HMD Eye-Tracking Measurement of Miniature Eye Movement Toward VR Image Navigation. In: Antona, M., Stephanidis, C. (eds) Universal Access in Human-Computer Interaction. User and Context Diversity. HCII 2022. Lecture Notes in Computer Science, vol 13309. Springer, Cham. https://doi.org/10.1007/978-3-031-05039-8_14

  • DOI: https://doi.org/10.1007/978-3-031-05039-8_14

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-05038-1

  • Online ISBN: 978-3-031-05039-8

  • eBook Packages: Computer Science; Computer Science (R0)
