
Environmental Planning Using Augmented Reality

Chapter in: Handbook of Augmented Reality

Abstract

Before a large project is carried out, it is difficult to foresee its resulting appearance accurately and realistically. Even when specialists are invited to examine blueprints or data, the outcome cannot be perceived directly and visually. Moreover, because a large project affects its environment greatly, it is necessary to develop a powerful new visualization tool that can assess environmental impact accurately, from both the aesthetic and the ecological points of view, before construction begins. Clearly, this is of great importance for avoiding negative effects on the environment. In recent years, virtual reality (VR) has become popular for project planning and provides a new means of visual assessment. Virtual reality relies on three-dimensional (3D) computer graphics to model and render virtual environments in real time. This approach usually requires laborious modeling and expensive 3D graphics accelerators for fast rendering of complex scenes, and the rendering quality and scene complexity are often limited by the real-time constraint. Consequently, it is difficult to obtain a satisfactory solution. The augmented reality (AR) method [1, 2] can overcome these limitations. Augmented reality is a technology that incorporates the rich information available in the real world into a virtual environment. AR can be realized by overlaying 3D graphical objects onto images without pre-calibrated camera parameters. Because it is a video-image approach, only the few virtual objects to be fused into the real environment need to be drawn, so the rendering cost is independent of the scene complexity. As a result, an AR system does not require specialized graphics accelerators. On the other hand, the realism of an AR system depends on the quality of the input images, which makes it easy to achieve more realism than in many VR systems. In the following, the main idea of vision-based AR for environmental planning is presented.
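As a concrete illustration of the vision-based overlay idea described above, the following is a minimal, hypothetical sketch in Python with OpenCV: it matches SIFT features [3] between a reference photograph of the site and the current camera frame, estimates a homography robustly with RANSAC [4], and warps a pre-rendered image of the planned structure into the frame. The file names, the planar-scene assumption behind the single homography, and the simple compositing step are illustrative assumptions, not the chapter's actual pipeline.

```python
# Hypothetical sketch: overlay a pre-rendered view of a planned structure onto
# a camera frame of the real site, using feature matching and a RANSAC homography.
# File names and image sizes are placeholders, not from the chapter.
import cv2
import numpy as np

reference = cv2.imread("site_reference.jpg", cv2.IMREAD_GRAYSCALE)  # earlier photo of the site
frame = cv2.imread("camera_frame.jpg")                              # current camera image
overlay = cv2.imread("planned_building.png")                        # rendering aligned to the reference view
                                                                    # (assumed same size as the reference)

# 1. Detect SIFT keypoints and descriptors (cf. Lowe [3]) in both images.
sift = cv2.SIFT_create()
kp_ref, des_ref = sift.detectAndCompute(reference, None)
kp_frm, des_frm = sift.detectAndCompute(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), None)

# 2. Match descriptors and keep only distinctive matches (Lowe's ratio test).
matcher = cv2.BFMatcher()
good = [m for m, n in matcher.knnMatch(des_ref, des_frm, k=2)
        if m.distance < 0.75 * n.distance]

# 3. Estimate the reference-to-frame homography robustly with RANSAC (cf. [4]).
src = np.float32([kp_ref[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
dst = np.float32([kp_frm[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
H, inliers = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)

# 4. Warp the overlay into the frame and composite it where it is non-black.
h, w = frame.shape[:2]
warped = cv2.warpPerspective(overlay, H, (w, h))
mask = warped.sum(axis=2) > 0
frame[mask] = warped[mask]
cv2.imwrite("augmented_frame.jpg", frame)
```

In a real planning system, full camera-pose registration and occlusion handling (see, e.g., [10–15]) would replace the naive "paste where non-black" compositing used here.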


References

  1. Ohta, Y., Mixed Reality: Merging Real and Virtual Worlds, Ohmsha/Springer-Verlag, Tokyo, 1999.

  2. Azuma, R.T., "A Survey of Augmented Reality," Presence: Teleoperators and Virtual Environments, vol. 6, no. 4, 1997, pp. 355–385.

  3. Lowe, D., "Distinctive Image Features from Scale-Invariant Keypoints," Int'l J. Computer Vision, vol. 60, no. 2, November 2004, pp. 91–110.

  4. Fischler, M., and Bolles, R., "Random Sample Consensus: A Paradigm for Model Fitting with Applications to Image Analysis and Automated Cartography," CACM, vol. 24, no. 6, June 1981, pp. 381–395.

  5. Park, J., and Kak, A.C., "3D Modeling of Optically Challenging Objects," IEEE Trans. on Visualization and Computer Graphics, vol. 14, no. 2, March 2008, pp. 246–262.

  6. Behringer, R., Park, J., and Sundareswaran, V., "Model-Based Visual Tracking for Outdoor Augmented Reality," Proceedings of the Int'l Symp. on Mixed and Augmented Reality, Germany, October 2002, p. 277.

  7. Drummond, T., and Cipolla, R., "Real-Time Visual Tracking of Complex Structures," IEEE Trans. on Pattern Analysis and Machine Intelligence, vol. 24, no. 7, July 2002, pp. 932–946.

  8. Hartley, R., and Zisserman, A., Multiple View Geometry in Computer Vision, 2nd ed., Cambridge University Press, 2004.

  9. Zhu Miao-liang, Yao Yuan, and Jiang Yun-liang, "A Survey on Augmented Reality," Chinese Journal of Image and Graphics, vol. 9, no. 7, July 2004, pp. 767–774.

  10. Shi Qi, Wang Yong-tian, and Cheng Jing, "Vision-Based Algorithm for Augmented Reality Registration," Journal of Image and Graphics, vol. 7, no. 7, July 2002, pp. 679–683.

  11. Shahidan, M.S., Ibrahim, N., Zabil, M.H.M., and Yusof, A., "An Implementation Review of Occlusion-Based Interaction in Augmented Reality Environment," 2nd International Conference on Computing & Informatics, Kuala Lumpur, Malaysia, June 2009, pp. 24–25.

  12. Wloka, M., and Anderson, B., "Resolving Occlusions in Augmented Reality," Symposium on Interactive 3D Graphics Proceedings, New York, August 1995, pp. 5–12.

  13. Shen, J., and Liu, H., "Solving Occlusion Problem in Augmented Reality," Chinese Journal of UEST, vol. 30, no. 3, March 2001, pp. 236–240.

  14. Wang, H., Sengupta, K., Kumar, P., and Sharma, R., "Occlusion Handling in Augmented Reality Using Background-Foreground Segmentation and Projective Geometry," Presence: Teleoperators & Virtual Environments, vol. 14, no. 3, June 2005, pp. 264–277.

  15. Zhu Jiejie, and Pan Zhigen, "Computer Vision Based Occlusion Handling Algorithm for Video-Based Augmented Reality," Chinese Journal of Computer-Aided Design & Computer Graphics, vol. 19, no. 12, December 2007, pp. 1624–1628.

  16. Koenderink, J.J., and van Doorn, A.J., "Affine Structure from Motion," J. Opt. Soc. Am. A, vol. 8, no. 2, February 1991, pp. 377–385.

  17. Seitz, M., and Dyer, C.R., "Complete Scene Structure from Four Point Correspondences," Proc. Fifth Int'l Conf. on Computer Vision, Cambridge, MA, June 1995, pp. 330–337.

  18. Hartley, R.I., "In Defence of the 8-Point Algorithm," Proceedings of the Fifth International Conference on Computer Vision, Cambridge, Massachusetts, USA, June 1995, pp. 1064–1070.

  19. Liu, T., Moore, A.W., Gray, A., and Yang, K., "An Investigation of Practical Approximate Nearest Neighbor Algorithms," Advances in Neural Information Processing Systems, Vancouver, BC, Canada, 2005, pp. 825–832.

  20. Torr, P.H.S., and Murray, D.W., "The Development and Comparison of Robust Methods for Estimating the Fundamental Matrix," International Journal of Computer Vision, vol. 24, no. 3, October 1997, pp. 271–300.


Author information


Corresponding author

Correspondence to Jie Shen.



Copyright information

© 2011 Springer Science+Business Media, LLC

About this chapter

Cite this chapter

Shen, J. (2011). Environmental Planning Using Augmented Reality. In: Furht, B. (eds) Handbook of Augmented Reality. Springer, New York, NY. https://doi.org/10.1007/978-1-4614-0064-6_22


  • DOI: https://doi.org/10.1007/978-1-4614-0064-6_22


  • Publisher Name: Springer, New York, NY

  • Print ISBN: 978-1-4614-0063-9

  • Online ISBN: 978-1-4614-0064-6

  • eBook Packages: Computer Science, Computer Science (R0)
