Abstract
Significant depth judgment errors are common in augmented reality. This study presents a visualization approach for improving relative depth judgments in augmented reality. The approach adds auxiliary augmented objects alongside the main augmentation to support ordinal and interval depth judgment tasks. The auxiliary augmentations are positioned spatially near real-world objects, so the location of the main augmentation can be deduced from the relative depth cues between the augmented objects. In the experiment, the approach was tested in an "X-ray" visualization scenario with a video see-through system. Two relative depth cues, in addition to motion parallax, were used between graphical objects: relative size and binocular disparity. The results show that the presence of auxiliary objects significantly reduced depth judgment errors, both in judging ordinal location with respect to a wall (in front of, at, or behind) and in judging depth intervals. Beyond reducing errors, the auxiliary augmentation increased confidence in depth judgments and was subjectively preferred. The visualization approach did not affect viewing time.
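The two relative cues named above have a simple geometric basis, which the following sketch illustrates. This is a generic small-angle approximation, not the authors' implementation; the 63 mm interpupillary distance (IPD) is an assumed population average, and the distances in the example are hypothetical.

```python
import math

def relative_size_ratio(d_ref: float, d_obj: float) -> float:
    """Apparent-size ratio of two identically sized objects at
    distances d_ref and d_obj (meters): size falls off as 1/distance."""
    return d_ref / d_obj

def binocular_disparity_arcmin(d_ref: float, d_obj: float,
                               ipd: float = 0.063) -> float:
    """Approximate relative binocular disparity, in arcminutes, between
    points at d_ref and d_obj (meters).

    Small-angle approximation: disparity ~ IPD * (1/d_ref - 1/d_obj)
    radians. Positive means the d_ref point is nearer. The 63 mm IPD
    is an assumed average, not a value taken from the study.
    """
    return math.degrees(ipd * (1.0 / d_ref - 1.0 / d_obj)) * 60.0

# Hypothetical example: auxiliary augmentation anchored to a wall at 4 m,
# main augmentation at 5 m.
print(relative_size_ratio(4.0, 5.0))         # 0.8: main object looks 80% the size
print(binocular_disparity_arcmin(4.0, 5.0))  # ~10.8 arcmin of relative disparity
```

A disparity of roughly 10 arcminutes between neighboring graphical objects is well above typical stereoacuity thresholds, which is why a nearby auxiliary object anchored to a known real surface can disambiguate whether the main augmentation lies in front of or behind that surface.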
Improving relative depth judgments in augmented reality with auxiliary augmentations