
Augmented reality using personal projection and retroreflection

  • Original Article
  • Published in: Personal and Ubiquitous Computing

Abstract

The support of realistic and flexible training simulations for military, law enforcement, emergency response, and other domains has been an important motivator for the development of augmented reality technology. An important vision for achieving this goal has been the creation of a versatile “stage” for physical, emotional, and cognitive training that combines virtual characters and environments with real-world elements, such as furniture and props. This paper presents REFLCT, a mixed reality projection framework that couples a near-axis personal projector design with tracking and novel retroreflective props and surfaces. REFLCT provides multiple users with personalized, perspective-correct imagery that is composited uniquely for each user directly into and onto the surrounding environment, without any optics positioned in front of the user’s eyes or face. These characteristics facilitate team training experiences that allow users to interact easily with their teammates while wearing standard-issue gear. REFLCT can present virtual humans who make deictic gestures and establish eye contact without the geometric ambiguity of a typical projection display. It can also display perspective-correct scenes that require a realistic approach for detecting and communicating potential threats among multiple users in disparate locations. Beyond training applications, this display system appears well suited to other user interface and application domains, such as asymmetric collaborative workspaces and personal information guides.
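Perspective-correct imagery for a tracked user, as the abstract describes, is typically rendered with an off-axis (generalized) perspective frustum computed from the user's head pose relative to the display surface. The sketch below is illustrative only and is not from the REFLCT implementation: it assumes a planar retroreflective surface with known corner positions and computes near-plane frustum extents from a tracked eye point. The function and variable names are hypothetical.

```python
# Illustrative sketch: off-axis ("generalized") perspective frustum for a
# head-tracked viewer, as commonly used for projection displays.
# Names are hypothetical, not from the REFLCT system.

def _sub(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def _dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def _cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def _normalize(a):
    m = _dot(a, a) ** 0.5
    return (a[0] / m, a[1] / m, a[2] / m)

def off_axis_frustum(eye, lower_left, lower_right, upper_left, near=0.1):
    """Return (left, right, bottom, top) frustum extents at the near plane
    for a tracked eye position and a planar display surface given by three
    of its corners, all expressed in the same world coordinate frame."""
    vr = _normalize(_sub(lower_right, lower_left))   # screen-right basis vector
    vu = _normalize(_sub(upper_left, lower_left))    # screen-up basis vector
    vn = _normalize(_cross(vr, vu))                  # screen normal, toward viewer
    va = _sub(lower_left, eye)    # eye -> lower-left corner
    vb = _sub(lower_right, eye)   # eye -> lower-right corner
    vc = _sub(upper_left, eye)    # eye -> upper-left corner
    d = -_dot(va, vn)             # perpendicular eye-to-screen distance
    s = near / d                  # scale screen extents onto the near plane
    return (_dot(vr, va) * s, _dot(vr, vb) * s,
            _dot(vu, va) * s, _dot(vu, vc) * s)
```

A centered eye yields a symmetric frustum; as the tracked head moves off-center, the extents become asymmetric, keeping the rendered scene registered to the fixed physical surface for that user's viewpoint.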





Acknowledgments

The authors wish to thank John Hart for guidance with this project, as well as Thai Phan, Brad Newman, and David Nelson for numerous contributions. This work was funded by the US Army Research, Development, and Engineering Command (RDECOM) via an Institute for Creative Technologies Seedling grant. The content of the information does not necessarily reflect the position or the policy of the US Government, and no official endorsement should be inferred.

Author information


Correspondence to David M. Krum.


About this article

Cite this article

Krum, D.M., Suma, E.A. & Bolas, M. Augmented reality using personal projection and retroreflection. Pers Ubiquit Comput 16, 17–26 (2012). https://doi.org/10.1007/s00779-011-0374-4

