
IMAF: in situ indoor modeling and annotation framework on mobile phones

  • Original Article
  • Published in Personal and Ubiquitous Computing

Abstract

The widespread use of smartphones with GPS and orientation sensors opens up new possibilities for location-based annotation in outdoor environments. Indoors, however, a completely different approach is required. In this study, we introduce IMAF, a novel indoor modeling and annotation framework for mobile phones. The framework produces a 3D room model in situ from five user selections, without prior knowledge of actual geometric distances and without additional apparatus. Using the framework, non-experts can easily capture room dimensions and annotate locations and objects within the room, linking virtual information to the real space represented by an approximated box. To register the 3D room model to the real space, a hybrid method combining visual tracking with device sensors obtains accurate orientation tracking while still achieving interactive frame rates for real-time applications on a mobile phone. Once the created room model is registered to the real space, user-generated annotations can be attached and viewed in AR and VR modes. Finally, the framework supports object-based space-to-space registration, so that annotations can be viewed and created from views other than the one that generated them. The performance of the proposed framework is demonstrated in terms of model accuracy, modeling time, stability of visual tracking, and annotation satisfaction. In the last section, we present two exemplar applications built on IMAF.
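The hybrid of visual tracking and device sensors mentioned above can be illustrated with a generic complementary filter: the gyroscope is integrated every frame (fast but drifting), and drift-free visual fixes pull the estimate back whenever the tracker succeeds. This is a minimal sketch of that general fusion idea, not the paper's actual algorithm; the function name, the fixed sample interval `dt`, and the blending weight `alpha` are assumptions made for illustration.

```python
def complementary_filter(gyro_rates, visual_yaws, dt=0.02, alpha=0.98):
    """Fuse gyroscope yaw rates (rad/s) with optional visual yaw fixes (rad).

    gyro_rates  -- one angular-rate sample per frame
    visual_yaws -- one absolute yaw estimate per frame, or None when the
                   visual tracker has no result for that frame
    alpha       -- weight on the gyro-integrated estimate; (1 - alpha)
                   pulls toward the drift-free visual measurement
    """
    # Initialize from the first visual fix if one exists.
    yaw = visual_yaws[0] if visual_yaws[0] is not None else 0.0
    fused = []
    for rate, vis in zip(gyro_rates, visual_yaws):
        yaw += rate * dt                      # integrate gyro (drifts over time)
        if vis is not None:                   # visual tracker succeeded this frame
            yaw = alpha * yaw + (1 - alpha) * vis
        fused.append(yaw)
    return fused
```

With a biased gyroscope and a steady visual fix at 0 rad, the fused estimate stays bounded near zero instead of drifting linearly, while frames without a visual result fall back to gyro-only integration; this mirrors the trade-off the abstract describes between tracking accuracy and interactive frame rates.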






Acknowledgments

This research was supported by KOCCA/MCST CT R&D Program 2011 and NRF/MEST Global Frontier R&D Program on “Human-centered Interaction for Coexistence” (NRF-M1AX A003-2011-0028361).

Author information

Corresponding author

Correspondence to Woontack Woo.


About this article

Cite this article

Kim, H., Reitmayr, G. & Woo, W. IMAF: in situ indoor modeling and annotation framework on mobile phones. Pers Ubiquit Comput 17, 571–582 (2013). https://doi.org/10.1007/s00779-012-0516-3

