
Displays

Volume 34, Issue 2, April 2013, Pages 153-164

Egocentric distance perception in large screen immersive displays

https://doi.org/10.1016/j.displa.2013.01.001

Abstract

Many scientists have demonstrated that, compared to the real world, egocentric distances in head-mounted display virtual environments are underestimated. However, distance perception in large screen immersive displays (LSIDs) has received less attention. We investigate egocentric distance perception in a virtual office room projected in three LSIDs: a semi-spherical display, a Max Planck Institute CyberMotion Simulator cabin and a flat display. The goal of our research is to systematically investigate distance perception in LSIDs with commonly used technical specifications. We specifically investigate the influence of distance to the target, stereoscopic projection and motion parallax on distance perception. We use verbal reports and blind walking as response measures for the real world experiment. Due to the limited space in the three LSIDs, we use only verbal reports as the response measure for the experiments in the virtual environment. Our results show an overall underestimation of distances in the LSIDs, while verbal estimates of distances are nearly veridical in the real world. We find that even when providing motion parallax and stereoscopic depth cues to the observer in the flat LSID, participants estimate the distances to be smaller than intended. Although stereo cues in the flat LSID do increase distance estimates for the nearest distance, the impact of the stereoscopic depth cues is not enough to result in veridical distance perception. Further, we demonstrate that the distance to the target significantly influences the percent error of verbal estimates in both the real and the virtual world. The impact of the distance to the target on the distance judgments is the same in the real world and in two of the LSIDs, namely the MPI CyberMotion Simulator cabin and the flat display. However, in the semi-spherical display we observe a significantly different influence of distance to the target on verbal estimates of egocentric distances. Finally, we discuss potential reasons for our results. Based on our findings, we give general suggestions for improving the accuracy of depth perception in LSIDs and suggest methods to compensate for the underestimation of verbal distance estimates in LSIDs.

Highlights

  • We investigate distance perception in three large screen immersive displays (LSIDs).
  • Verbal reports of egocentric distances are nearly veridical in the real world.
  • Verbal reports of egocentric distances are underestimated in all three LSIDs.
  • We found an effect of distance in the real and virtual worlds.
  • Stereoscopic projection impacts only near (2 m) distance estimates.

Introduction

Immersive virtual environments (VEs) are generally used as a medium for simulating a variety of real world scenarios [1], [2]. For instance, design and assembly processes are often performed in a controlled manner using a VE instead of simulating the situation in the real world [3], [4]. VEs are also preferable for other types of applications, such as training scenarios. One reason for this is that real world simulations and training scenarios are often labor-intensive for both the instructors and the actors involved in the role-playing scenarios [2]. Additionally, some real world training scenarios, such as emergency and military simulations, can involve health-threatening tasks, whereas the VE scenarios can be practiced without putting the trainees in danger [5].

Ideally, VE training should provoke realistic responses from the user, similar to the user’s real world behavior. It has been shown that by providing the viewer with the illusion of being present in the virtual world, one can enhance realistic responses to events simulated in realistic VEs [6], [7]. Additionally, the user of a VE simulation should perceive the 3D spatial layout as the designer and programmer of the virtual world intend. In order to improve the realism of virtual worlds and the usefulness of VE technology, scientists have investigated the differences between the real and the virtual world with respect to the sensory information provided in the virtual world, as well as with regard to the user’s perception and action [6], [7], [8].

One major difference between the real and the virtual world has been identified in the area of spatial perception. Scientists have shown that while people are able to accurately estimate egocentric distances in the real world, at least up to about 20 m [9], people underestimate egocentric distances in head-mounted display (HMD) VEs by up to 50% [10], [11]. Egocentric distance is the absolute distance between the observer and an external point in space [12]. Estimation of egocentric distances is often used to provide information about space perception in both real and virtual worlds [9], [10], [13], [14], [15], [16], [17], [18], [19]. Most of the studies investigating egocentric distance estimates in VEs have been conducted using HMDs. Interestingly, the estimation of egocentric distances in large screen immersive displays (LSIDs) has rarely been a topic of research, even though LSIDs are also commonly used setups for immersive virtual reality (VR) [1]. The researchers investigating egocentric distance perception in LSIDs have provided inconclusive results. Several scientists found underestimation of egocentric distances in LSIDs [13], [14], [15], while Riecke et al. found accurate egocentric distance estimates in a variety of display setups, including an LSID [20].

Often LSIDs are custom made for the specific purpose of the setup (see Ni et al. [1] for detailed descriptions of various LSIDs and example applications). Thus, the technical aspects of LSIDs often vary in many ways. For instance, LSIDs can vary in shape (flat, curved or a completely enclosed cave automatic virtual environment (CAVE) [21]), the number of projection surfaces, the ability to track the position of the user and the availability of stereoscopic projection. All these specifications of the projection setup influence the ability to convey visual cues providing depth information in a natural way (see Thompson et al. [22] and Cutting et al. [16] for summary chapters). For example, focus cues are influenced by the distance between the viewer and the projection surface, the ability to track the position of the observer’s head in real time is necessary for cues such as motion parallax, and the ability to dynamically warp the projected image is necessary to provide accurate cues with respect to eye-height at different natural postures, such as sitting or standing. In summary, different combinations of visual cues, resulting from the LSIDs’ technical specifications, may create a different perceptual experience of depth. However, all displays have a center of projection (CoP), which is located in front of the display at a certain distance from the center of the display (for more detailed information about the calculation of the CoP see Vishwanath et al. [23]). The CoP is the location from which the viewer can most accurately perceive the 3D layout of a virtual scene projected in a given display [24], [23], [25]. Therefore, it is helpful to use the CoP as the viewing position when evaluating depth perception in a variety of LSIDs.
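To make the geometry of the CoP concrete for the simplest case, the following sketch locates the CoP of a flat screen rendered with a symmetric perspective frustum. It is only an illustration: the screen width and field of view below are hypothetical values, not the specifications of the displays studied here, and curved or off-axis setups require the more general treatment in Vishwanath et al. [23].

import math

def flat_screen_cop_distance(screen_width_m, horizontal_fov_deg):
    # For a flat screen rendered with a symmetric perspective frustum, the
    # projected image is geometrically correct only for an eye on the axis
    # through the screen center, at the distance where the physical screen
    # subtends exactly the rendering field of view: d = (W / 2) / tan(fov / 2).
    half_fov_rad = math.radians(horizontal_fov_deg) / 2.0
    return (screen_width_m / 2.0) / math.tan(half_fov_rad)

# Hypothetical example: a 4 m wide flat screen rendered with a 90 degree
# horizontal field of view places the CoP 2 m in front of the screen center.
print(flat_screen_cop_distance(4.0, 90.0))  # -> 2.0

Viewing from any other position distorts the perceived 3D layout of the projected scene, which is why the CoP is used as the viewing position in the experiments described below.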

We design a series of experiments in both the real world and in three LSIDs with different specifications: a custom-made semi-spherical display, a Max Planck Institute CyberMotion Simulator cabin (MPI cabin) and a flat LSID. Our LSIDs have commonly used technical setups, which are representative of display setups used in state-of-the-art VR laboratories. Like most other LSIDs, ours were built specifically for the purposes they serve. We note that the technical specifications (e.g. shape and size) of the tested LSIDs differ from each other. Our LSIDs may also differ in many ways from the LSIDs in other facilities. For this reason we use the CoP as the viewing location for each LSID and compare the distance estimates from each LSID to the distance estimates in the real world. For conducting our research we use the standard procedure for distance estimation experiments [26], [27], [28], [29]. We use a real room and a 3D replica of the real room as the visual stimuli for the experiments. For the real world experiment we use verbal reports and blind walking as response measures. Due to the limited space in the three LSIDs, which makes action-based measures difficult to conduct in a controlled manner, our participants performed verbal reports as the response measure for the LSID experiments.

The goal of our research is to evaluate the accuracy of egocentric distance perception over a large range of distances in several LSIDs. Considering several studies [13], [30] which suggest that distance perception might be influenced by the distance to the target, we make two important design decisions for our experiments. First, our experiments cover more distances than previous experiments, so that we can better understand the role of the distance to the target in egocentric distance estimates in LSIDs. We investigate the pattern of distance perception in each of the tested LSIDs closely. Then we compare the pattern from each LSID to the pattern of the distance estimates performed in the real world. This comparison will provide new general knowledge that could be beneficial for conveying the 3D spatial layout of virtual worlds projected in LSIDs more accurately. Second, since most of the studies [13], [15], [20] investigating LSID VEs do not use stereoscopic projection and motion parallax cues in their experimental setup, in our flat LSID we make a direct comparison between the impact of stereoscopic projection and motion parallax on egocentric distance estimates. Assessing stereoscopic projection and motion parallax cues in our flat LSID and investigating depth perception in three LSIDs will provide more insights into the impact of display properties on distance perception in LSIDs. Our findings will extend the knowledge about the perception of egocentric distances in LSIDs. Thus, we can use the results of our research to suggest methods for improving the accuracy of depth perception in LSIDs and to compensate for the underestimation of verbal distance estimates in LSIDs.

Section snippets

Background

Several different research areas related to space perception are of great importance to our research. First, we discuss relevant research which has investigated egocentric distance perception in the real world and in HMD virtual reality setups. Then we outline the visual depth cues that are related to the technical specifications of LSIDs. Finally, we discuss what is already known about space perception in LSIDs.

Experimental design

In order to investigate the perception of egocentric distances across different LSIDs we designed a series of experiments with a between-subjects design. Altogether we had 77 participants (35 male and 42 female), who had normal or corrected-to-normal vision. None of them had seen the real room used for this research before taking part in the experiment. Each participant took part in only one condition (in only one of the experiments) and viewed the environment binocularly. Each participant viewed

Overall results

The accuracy of egocentric distance estimation in the three LSIDs and in the real world was influenced by the viewing distance to the target (see Fig. 10). In order to evaluate the influence of distance to the target on distance estimations more precisely, we conducted a linear regression analysis for each of the experimental conditions (real world, semi-spherical LSID, MPI cabin LSID and flat LSID). The linear regression analysis was performed on the verbal estimates of egocentric
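As an illustration of these two measures, the percent error of a verbal estimate is its signed deviation from the actual distance, and a linear regression of estimated against actual distance summarizes how the judgments change with the distance to the target (a slope below 1 indicates that estimates grow more slowly than the actual distance). The sketch below uses made-up example numbers, not the data or the exact analysis pipeline of the experiments reported here.

import numpy as np

# Hypothetical example data (meters); not the data collected in the experiments.
actual = np.array([2.0, 3.0, 4.0, 5.0, 6.0, 7.0])       # distances to the target
estimated = np.array([1.6, 2.3, 3.0, 3.7, 4.3, 5.0])    # verbal estimates

# Percent error of each verbal estimate relative to the actual distance.
percent_error = (estimated - actual) / actual * 100.0

# Linear regression of estimated on actual distance; a slope below 1
# means the estimates grow more slowly than the actual distances.
slope, intercept = np.polyfit(actual, estimated, 1)

print(percent_error)      # roughly -20% at 2 m down to about -29% at 7 m
print(slope, intercept)   # roughly 0.68 and 0.27 for these example numbers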

General discussion

Our results from the three LSID VEs are consistent with other recent findings [13], [14], [15], indicating significant underestimation of distances compared to the actual distances, and contradict Riecke et al. [20], which is the only study that found no underestimation of egocentric distances in LSIDs. The verbal estimates of egocentric distances provided in the LSIDs were on average underestimated compared to the real world. Overall the verbal distance estimations

Conclusions

We investigated egocentric distance perception in three LSIDs (semi-spherical, MPI cabin and flat) and in the real world. We found that while verbal reports of egocentric distances were veridical in the real world, they were significantly underestimated in all three LSIDs. In the flat LSID we specifically investigated the role of stereoscopic projection and motion parallax on egocentric distance estimations. Our findings indicated that stereoscopic depth cues create less underestimation but

Acknowledgments

This research was supported by WCU (World Class University) program through the National Research Foundation of Korea funded by the Ministry of Education, Science and Technology (R31-10008), the Center for Integrative Neuroscience, Tuebingen CIN2011-16, and the FP7 EU Project VR-HYPERSPACE (see vr-hyperspace.eu). The authors are grateful to Joachim Tesch, Sally Linkenauger, Michael Kerger, Trevor Dodds, Stephan Streuber, Ekaterina Volkova and Markus Leyrer for useful suggestions and discussions.

References (53)

  • S. Jayaram et al.

    Virtual assembly using virtual reality techniques

    Computer-Aided Design

    (1997)
  • T. Ni et al.

    A survey of large high-resolution display technologies, techniques, and applications

  • A.A. Rizzo et al.

    A virtual reality scenario for all seasons: the virtual classroom

    CNS Spectrum

    (2006)
  • E. van Wyk et al.

    Virtual reality training applications for the mining industry

  • W. Swartout et al.

    Toward virtual humans

    AI Magazine

    (2006)
  • M. Slater et al.

    How we experience immersive virtual environments: the concept of presence and its measurement

    Anuario de Psicologia

    (2009)
  • M. Slater et al.

    Visual realism enhances realistic response in an immersive virtual environment

    IEEE Computer Graphics and Applications

    (2009)
  • E. McManus, B. Bodenheimer, S. de la Rosa, S.S., H. Bülthoff, B. Mohler, The influence of avatar animation (self and...
  • J.M. Loomis et al.

    Visual space perception and visually directed action

    Journal of Experimental Psychology: Human Perception and Performance

    (1992)
  • J.W. Philbeck et al.

    Comparison of two indicators of perceived egocentric distance under full-cue and reduced-cue conditions

    Journal of Experimental Psychology: Human Perception and Performance

    (1997)
  • J.M. Loomis et al.

    Virtual and Adaptive Environments: Visual Perception of Egocentric Distance in Real and Virtual Environments

    (2003)
  • J.A. Da Silva

    Scales for perceived egocentric distance in a large open field: comparison of three psychophysical methods

    The American Journal of Psychology

    (1985)
  • J.M. Plumert et al.

    Distance perception in real and virtual environments

    ACM Transactions on Applied Perception

    (2005)
  • E. Klein et al.

    Measurement protocols for medium-field distance perception in large-screen immersive displays

  • T.Y. Grechkin et al.

    How does presentation method and measurement protocol affect distance estimation in real and virtual environments?

    ACM Transactions on Applied Perception

    (2010)
  • J.E. Cutting et al.

    Perceiving layout and knowing distances: the integration, relative potency and contextual use of different information about depth

    Perception of Space and Motion

    (1995)
  • B.J. Mohler et al.

    The influence of feedback on egocentric distance judgments in real and virtual environments

  • S.H. Creem-Regehr et al.

    The influence of restricted viewing conditions on egocentric distance perception: implications for real and virtual environments

    Perception

    (2005)
  • C.S. Sahm et al.

    Throwing versus walking as indicators of distance perception in similar real and virtual environments

    ACM Transactions on Applied Perception

    (2005)
  • B.E. Riecke et al.

    Display size does not affect egocentric distance perception of naturalistic stimuli

  • C. Cruz-Neira et al.

    Surround-screen projection-based virtual reality: the design and implementation of the cave

  • W.B. Thompson, in: Fundamentals of Computer Graphics (Second): Visual Perception, Ed. A.K. Peters, Ltd., Natick, MA,...
  • D. Vishwanath et al.

    Why pictures look right when viewed from the wrong place

    Nature Neuroscience

    (2005)
  • M.S. Banks et al.

    Where should you sit to watch a movie?

    SPIE

    (2005)
  • M.S. Banks et al.

    Perception of 3-d layout in stereo displays

    Information Display

    (2009)
  • B. Thompson et al.

    Does the quality of the computer graphics matter when judging distances in visually immersive environments?

    Presence: Teleoperators and Virtual Environments

    (2004)