DOI: 10.1145/3480433.3480434
Research article

Comparison of Differences between Human Eye Imaging and HMD Imaging

Published: 08 November 2021

Abstract

As virtual reality (VR) headsets with head-mounted displays (HMDs) attract more and more attention, new research questions have emerged. In VR games, VR films, and VR simulations, maintaining a strong sense of depth while minimizing visual discomfort is essential to the overall user experience. The depth cues of visual perception comprise proprioceptive cues, monocular cues, and binocular cues, which together provide accurate depth information about objects in personal space. Proprioceptive cues include convergence and accommodation, and quite a few studies have shown that the convergence and accommodation demands of virtual environments exacerbate visual discomfort. Monocular cues include occlusion, the relative size of objects, texture fineness, variations of light and shadow, optical thickness, perspective, and so on. Monocular cues give rise to psychological depth cues, which can be affected by the texture mapping of objects and by environment settings in the virtual environment. Binocular vision enables users to perceive the depth of different objects in space. When both eyes observe the same target object in reality, the images formed on the two retinas differ because of the lateral separation of the eyes; the brain processes this retinal stimulus to generate stereo vision. The perception of objects produced by binocular vision is in turn influenced by the depth cues present in the environment. Differences in the perceived depth of virtual information lead to a separation of spatial perception, which makes it difficult to relate interaction in the virtual environment to its real-world counterpart. In this paper, the physiological characteristics of the human eye are studied, the visual environment provided by head-mounted displays is summarized, and the limitations of the visual images provided by HMDs are identified by comparing the two. We also discuss the imaging mechanism of stereo vision in HMDs.
By comparing the differences between the stereo vision generated by HMD images and that of the human eye, this paper aims to provide theoretical suggestions for building a more realistic and immersive VR environment.
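The vergence-accommodation conflict mentioned above can be made concrete with a small numeric sketch. The snippet below is illustrative and not taken from the paper: the interpupillary distance and the HMD's fixed focal distance are assumed values chosen for the example. The eyes converge on the virtual object at its rendered distance, but must focus on the fixed display plane; the mismatch in accommodative demand (in diopters) is a commonly used correlate of visual discomfort.

```python
import math

# Assumed viewing parameters (illustrative, not from the paper).
IPD_M = 0.063        # average interpupillary distance, ~63 mm
HMD_FOCAL_M = 2.0    # assumed fixed focal plane of the HMD, 2 m

def vergence_angle_deg(distance_m: float, ipd_m: float = IPD_M) -> float:
    """Angle between the two lines of sight when both eyes fixate a
    point at the given distance (symmetric convergence)."""
    return math.degrees(2.0 * math.atan(ipd_m / (2.0 * distance_m)))

def accommodation_diopters(distance_m: float) -> float:
    """Accommodative demand in diopters (reciprocal of distance in m)."""
    return 1.0 / distance_m

def vergence_accommodation_conflict(virtual_distance_m: float) -> float:
    """Mismatch between the accommodative demand of the virtual object
    and that of the fixed HMD focal plane, in diopters."""
    return abs(accommodation_diopters(virtual_distance_m)
               - accommodation_diopters(HMD_FOCAL_M))

for d in (0.5, 1.0, 2.0, 5.0):
    print(f"object at {d} m: vergence {vergence_angle_deg(d):.2f} deg, "
          f"conflict {vergence_accommodation_conflict(d):.2f} D")
```

Under these assumptions the conflict vanishes only when the virtual object sits exactly at the display's focal plane (2 m here) and grows rapidly for near objects, which matches the abstract's point that convergence and accommodation in virtual environments exacerbate discomfort.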


Published In

AIVR 2021: 2021 5th International Conference on Artificial Intelligence and Virtual Reality (AIVR)
July 2021
134 pages
ISBN: 978-1-4503-8414-8
DOI: 10.1145/3480433
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher

Association for Computing Machinery

New York, NY, United States



Author Tags

  1. VR experience
  2. depth cue
  3. depth perception
  4. visual imaging
  5. visual perception

Qualifiers

  • Research-article
  • Research
  • Refereed limited
