ABSTRACT
Augmented reality (AR) adds virtual graphics, sounds or data to a real-world environment. Future head-up displays (HUDs) in vehicles will enable AR images to be presented at varying depths, potentially providing additional cues to drivers to facilitate task performance. In order to correctly position such AR imagery, it is necessary to know at what point the virtual image is discriminable in depth from a real-world object. In a two-alternative forced-choice psychophysical depth judgment task, 40 observers judged whether an AR image (a green diamond) appeared in front of or behind a static 'pedestrian' target. Depth thresholds for the AR image were tested with the pedestrian target at 5 m, 10 m, 20 m and 25 m, and at six locations relative to the pedestrian: the AR image was presented at different heights in the visual field (above, middle and below the real-world target) and across the horizontal plane (left, middle and right of the real-world target). Participants were more likely to report that the AR image was presented in front of the target rather than behind it. Inconsistent with previous findings, no overall effects of height or horizontal position were found. Depth thresholds scaled with distance, with larger thresholds at further distances. Findings also showed large individual differences and slow response times (above 2.5 s on average), suggesting difficulties in judging AR imagery in depth. Recommendations are made regarding where a HUD image should be located in depth if a designer wishes users to reliably perceive the image to be in front of, alongside, or behind a real-world object.
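As an illustration of the method described above, the sketch below fits a cumulative-Gaussian psychometric function to simulated two-alternative forced-choice "in front / behind" judgments to estimate a depth-discrimination threshold. All numbers (offsets, trial counts, the simulated observer's bias and sensitivity) are hypothetical, not taken from the study; the simulated bias toward "in front" mirrors the response tendency the abstract reports.

```python
# Hedged sketch: estimating a depth-discrimination threshold from 2AFC
# "in front / behind" judgments with a cumulative-Gaussian psychometric
# function. Data are simulated; all values are illustrative.
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

rng = np.random.default_rng(0)

# Signed depth offsets of the AR image relative to the target
# (negative = behind, positive = in front), in metres.
offsets = np.linspace(-2.0, 2.0, 9)
n_trials = 50  # trials per offset level

# Simulated observer: point of subjective equality (PSE) shifted so the
# image tends to be judged "in front"; sigma sets the threshold.
true_pse, true_sigma = -0.3, 0.6
p_front = norm.cdf(offsets, loc=true_pse, scale=true_sigma)
k_front = rng.binomial(n_trials, p_front)  # simulated "in front" counts

def psychometric(x, pse, sigma):
    # Probability of responding "in front" at signed offset x.
    return norm.cdf(x, loc=pse, scale=sigma)

(pse_hat, sigma_hat), _ = curve_fit(
    psychometric, offsets, k_front / n_trials, p0=[0.0, 1.0]
)

# One common threshold definition: the offset change taking the observer
# from 50% to ~84% "in front" responses, i.e. one sigma of the fit.
print(f"PSE ~ {pse_hat:.2f} m, threshold (1 sigma) ~ {sigma_hat:.2f} m")
```

A fit like this yields a per-observer PSE (the depth at which "in front" and "behind" responses are equally likely) and a threshold; repeating it per target distance would show the threshold scaling with distance that the abstract reports.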
Index Terms
- Depth discrimination between augmented reality and real-world targets for vehicle head-up displays