Abstract
We conducted preliminary measurements of eye and head-mounted display (HMD) movements to collect primary data for a spatio-temporal virtual reality (VR) navigation system. Specifically, we used the eye-tracking function of the Vive Pro Eye HMD to measure eye movements and the rotational movement of the HMD while participants gazed at a VR marker. We compared gazing at a fixed point with tracking a marker undergoing randomly bouncing linear motion in the horizontal and vertical directions, characterizing the gaze trajectories by their Hurst exponent and anisotropy.
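The paper itself reports no code; as a rough, generic illustration of the trajectory analysis named above, the following Python sketch estimates the Hurst exponent from the scaling of the mean squared displacement (for fractional Brownian motion, MSD grows as the lag to the power 2H) and the anisotropy as a ratio of displacement variances. The function names, the assumed (t, x, y) gaze trace, and the assumption that the marker moves along the x axis are ours, not the authors'.

```python
import numpy as np

def hurst_exponent(x, y, max_lag=50):
    """Estimate H from the scaling of the mean squared displacement.

    For fractional Brownian motion, MSD(lag) ~ lag**(2H); H < 1/2 is
    anti-persistent, H > 1/2 persistent.
    """
    lags = np.arange(1, max_lag + 1)
    msd = np.array([
        np.mean((x[lag:] - x[:-lag]) ** 2 + (y[lag:] - y[:-lag]) ** 2)
        for lag in lags
    ])
    # Slope of log(MSD) vs. log(lag) gives 2H.
    slope, _ = np.polyfit(np.log(lags), np.log(msd), 1)
    return slope / 2.0

def anisotropy(x, y, lag=1):
    """Ratio of displacement variance along x (assumed marker direction) to y.

    Values > 1: the gaze trajectory is stretched along the marker motion;
    values < 1: it expands perpendicular to it.
    """
    dx = x[lag:] - x[:-lag]
    dy = y[lag:] - y[:-lag]
    return np.var(dx) / np.var(dy)

# Hypothetical usage with a recorded gaze trace (columns: t, x, y):
# gaze = np.loadtxt("gaze_trace.csv", delimiter=",")
# H = hurst_exponent(gaze[:, 1], gaze[:, 2])
# A = anisotropy(gaze[:, 1], gaze[:, 2])
```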
Trajectories for fixed-point gazing and slow marker tracking showed Hurst exponents below 1/2, indicating anti-persistence. In contrast, as the marker velocity increased, the gaze displacements were stretched along the direction of marker motion and became persistent in that direction. As the marker speed decreased, the gaze trajectory expanded perpendicular to the marker motion, suggesting that the anti-persistent miniature motion enhances the collection of visual information. Users were also found to unconsciously superimpose a persistent HMD motion on the gaze motion in the horizontal direction. We infer that this tendency helps generate the miniature gaze motion used to collect visual information.
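As a minimal illustration of what Hurst exponents below and above 1/2 mean for a trajectory (again our own sketch, not the authors' procedure), the snippet below generates one-dimensional fractional Gaussian noise by Cholesky factorization of its autocovariance and integrates it into fractional Brownian motion: the H = 0.3 trace reverses direction frequently, like the anti-persistent fixed-gaze case, while the H = 0.7 trace drifts persistently, like gaze chasing a fast marker.

```python
import numpy as np

def fractional_gaussian_noise(n, hurst, rng=None):
    """Generate n samples of fractional Gaussian noise with exponent `hurst`
    via Cholesky factorization of its autocovariance (exact but O(n^2))."""
    rng = np.random.default_rng() if rng is None else rng
    k = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
    cov = 0.5 * ((k + 1.0) ** (2 * hurst)
                 + np.abs(k - 1.0) ** (2 * hurst)
                 - 2.0 * k ** (2 * hurst))
    return np.linalg.cholesky(cov) @ rng.standard_normal(n)

# Cumulative sums give 1-D fractional Brownian motion traces:
anti = np.cumsum(fractional_gaussian_noise(1000, 0.3))  # anti-persistent
pers = np.cumsum(fractional_gaussian_noise(1000, 0.7))  # persistent
```

Re-estimating H from such synthetic traces with an MSD-based estimator like the one sketched earlier should recover the input values, which is a simple sanity check for this kind of analysis.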
Acknowledgments
The authors would like to thank all the participants for their cooperation and helpful discussions. They would also like to thank Enago for English language review.
Copyright information
© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Fujimoto, S., Iwase, M., Matsuura, S. (2022). HMD Eye-Tracking Measurement of Miniature Eye Movement Toward VR Image Navigation. In: Antona, M., Stephanidis, C. (eds) Universal Access in Human-Computer Interaction. User and Context Diversity. HCII 2022. Lecture Notes in Computer Science, vol 13309. Springer, Cham. https://doi.org/10.1007/978-3-031-05039-8_14
Print ISBN: 978-3-031-05038-1
Online ISBN: 978-3-031-05039-8