
X-Board: an egocentric adaptive AR assistant for perception in indoor environments

  • Original Article
  • Published in Virtual Reality

Abstract

Augmented reality (AR) has the potential to become an effective assistive technology for emergencies in the future. However, raw AR content can confuse users’ visual perception and occlude information in the physical world. In this research, we propose X-Board, an X-ray visualization-based AR assistant for perception in indoor environments. Following its design principles, X-Board provides visual–spatial cues through a grid mesh that corresponds to the occluding surface in front of the target object. X-Board also interacts with the physical world in real time, improving the coherence between the virtual and real worlds. To ensure the appropriate allocation of the user’s visual resources, X-Board recognizes the user’s visual intention from gaze data and adapts its display accordingly. The results of the user evaluation show that X-Board improves the accuracy and speed of perception and reduces users’ cognitive load, confirming its usability. In our simulated perception scenario, users equipped with X-Board could effectively perceive the spatial positions of their comrades in an occluded indoor environment.
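As a concrete illustration of the gaze-driven adaptive display idea summarized above, the following minimal sketch shows one possible way to recognize visual intention from gaze dwell and fade an X-ray overlay in and out. It is a hypothetical example written for this summary, not the authors' implementation; the dwell threshold, gaze radius, fade rate, and all class and function names are assumptions introduced here for illustration.

```python
# Minimal sketch (not the authors' implementation) of a gaze-driven adaptive
# X-ray overlay: when the user's gaze dwells near the occluded target's
# projected screen position, the overlay (e.g., a grid mesh on the occluding
# surface) fades in; otherwise it fades out to free visual resources.
# All thresholds and names are illustrative assumptions.

from dataclasses import dataclass
import math


@dataclass
class GazeSample:
    t: float  # timestamp in seconds
    x: float  # normalized screen x in [0, 1]
    y: float  # normalized screen y in [0, 1]


class AdaptiveXRayOverlay:
    def __init__(self, dwell_threshold_s=0.4, gaze_radius=0.08, fade_rate=4.0):
        self.dwell_threshold_s = dwell_threshold_s  # dwell time that signals intent
        self.gaze_radius = gaze_radius              # "near target" radius (normalized)
        self.fade_rate = fade_rate                  # opacity change per second
        self.dwell_start = None                     # when the current dwell began
        self.opacity = 0.0                          # overlay opacity in [0, 1]
        self.last_t = None

    def update(self, sample: GazeSample, target_xy):
        """Update overlay opacity from one gaze sample and the target's screen position."""
        dist = math.hypot(sample.x - target_xy[0], sample.y - target_xy[1])
        on_target = dist <= self.gaze_radius

        if on_target:
            if self.dwell_start is None:
                self.dwell_start = sample.t
            intent = (sample.t - self.dwell_start) >= self.dwell_threshold_s
        else:
            self.dwell_start = None
            intent = False

        dt = 0.0 if self.last_t is None else sample.t - self.last_t
        self.last_t = sample.t
        step = self.fade_rate * dt
        self.opacity = min(1.0, self.opacity + step) if intent else max(0.0, self.opacity - step)
        return self.opacity


if __name__ == "__main__":
    overlay = AdaptiveXRayOverlay()
    target = (0.5, 0.5)  # projected screen position of an occluded teammate
    # Simulated 60 Hz gaze stream that settles on the target after 0.5 s.
    for i in range(120):
        t = i / 60.0
        x, y = (0.2, 0.2) if t < 0.5 else (0.51, 0.49)
        alpha = overlay.update(GazeSample(t, x, y), target)
    print(f"overlay opacity after 2 s: {alpha:.2f}")
```

In a real AR pipeline, the gaze samples would come from the headset's eye tracker, and the returned opacity would drive the transparency of the grid mesh rendered on the occluding surface.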


Data availability

All the data and materials in the current study are available from the corresponding author on reasonable request.



Acknowledgements

The authors wish to thank the anonymous reviewers for their constructive comments and the user evaluation participants for their time. This work was supported by the Pre-research Project of the 14th Five-Year Plan (Grant Number: 50904040201).

Author information

Corresponding author

Correspondence to Zhenning Zhang.

Ethics declarations

Conflict of interest

The authors report no potential conflict of interest. For research involving human participants, data on participants’ operations were collected in every experimental environment, and participants consented to the use of these data for academic purposes.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Zhang, Z., Pan, Z., Li, W. et al. X-Board: an egocentric adaptive AR assistant for perception in indoor environments. Virtual Reality 27, 1327–1343 (2023). https://doi.org/10.1007/s10055-022-00742-3


Keywords

Navigation