
Perceptual self-position estimation based on gaze tracking in virtual reality

  • Original Article
  • Published:
Virtual Reality

Abstract

Human depth perception diverges between virtual and real space, and this depth discrepancy distorts a user's spatial judgment in a virtual environment: the user cannot precisely locate their self-position in virtual space. Existing localization methods ignore the discrepancy and concentrate only on increasing location accuracy in real space, so the discrepancy persists in virtual space and induces visual discomfort. In this paper, a localization method based on depth perception is proposed to measure the self-position of the user in a virtual environment. Using binocular gaze tracking, the method estimates perceived depth and constructs an eye matrix from the measured gaze convergence on a target. By comparing the eye matrix with the camera matrix, the method automatically calculates the actual depth of the viewed target; the difference between the actual depth and the perceived depth can then be estimated explicitly, without markers. The position of the virtual camera is compensated by this depth difference to obtain the perceptual self-position. Furthermore, a virtual reality system is redesigned by adjusting the virtual camera position so that the distance from the user to an object feels the same in virtual and real space. Experimental results demonstrate that the redesigned system improves the user's visual experience, validating the superiority of the proposed localization method.
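To make the pipeline above concrete, the sketch below shows one plausible way to realize its two core steps: estimating perceived depth by triangulating the vergence point of the two gaze rays, and compensating the virtual camera position by the difference between actual and perceived depth. This is a minimal illustration under stated assumptions, not the authors' implementation: the function names are hypothetical, gaze ray origins and directions are assumed to be expressed in a common head-fixed coordinate frame, and the actual depth of the fixated target is assumed to be supplied by the renderer (in the paper it is instead derived by comparing the eye matrix with the camera matrix).

```python
import numpy as np

def vergence_point(o_left, d_left, o_right, d_right):
    """Midpoint of the shortest segment between the two gaze rays.

    o_*: 3D eye positions (ray origins); d_*: gaze direction vectors.
    Returns None when the rays are (near) parallel, i.e. the eyes
    are converged at infinity.
    """
    d1 = d_left / np.linalg.norm(d_left)
    d2 = d_right / np.linalg.norm(d_right)
    w0 = o_left - o_right
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b
    if abs(denom) < 1e-9:  # near-parallel gaze rays
        return None
    s = (b * e - c * d) / denom
    t = (a * e - b * d) / denom
    return 0.5 * ((o_left + s * d1) + (o_right + t * d2))

def perceived_depth(o_left, d_left, o_right, d_right, view_dir):
    """Depth of the vergence point along the viewing axis, measured
    from the cyclopean point midway between the eyes."""
    p = vergence_point(o_left, d_left, o_right, d_right)
    if p is None:
        return np.inf
    cyclopean = 0.5 * (o_left + o_right)
    return float((p - cyclopean) @ (view_dir / np.linalg.norm(view_dir)))

def compensate_camera(cam_pos, view_dir, actual_depth, perceived):
    """Shift the virtual camera along the viewing axis by the
    difference between actual and perceived depth."""
    n = view_dir / np.linalg.norm(view_dir)
    return cam_pos + (actual_depth - perceived) * n

# Worked example: eyes 64 mm apart, both fixating a target 2 m ahead.
o_l, o_r = np.array([-0.032, 0.0, 0.0]), np.array([0.032, 0.0, 0.0])
target = np.array([0.0, 0.0, 2.0])
depth = perceived_depth(o_l, target - o_l, o_r, target - o_r,
                        view_dir=np.array([0.0, 0.0, 1.0]))
# depth evaluates to 2.0 m; if the renderer reported an actual depth
# of 2.3 m, compensate_camera would move the camera 0.3 m forward.
```

The triangulation is the standard closest-point-between-rays construction; taking the segment midpoint tolerates the measurement noise that keeps real gaze rays from intersecting exactly.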

Author information

Corresponding author

Correspondence to Huabiao Qin.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article

Cite this article

Liu, H., Qin, H. Perceptual self-position estimation based on gaze tracking in virtual reality. Virtual Reality 26, 269–278 (2022). https://doi.org/10.1007/s10055-021-00553-y
