Abstract
Before a large engineering project is carried out, it is difficult to form an accurate and realistic picture of how the finished project will look. Even when specialists examine blueprints and data, the results cannot be perceived directly and visually. Moreover, because a large project affects its environment considerably, a powerful new visualization tool is needed that can assess environmental impact accurately, from both aesthetic and ecological points of view, before construction begins; this is clearly important for avoiding negative effects on the environment. In recent years, virtual reality (VR) has become popular for project planning and provides a new means of visual assessment. VR uses three-dimensional (3D) computer graphics to model and render virtual environments in real time. This approach usually requires laborious modeling and expensive 3D graphics accelerators for fast rendering of complicated scenes, and the real-time constraint often limits rendering quality and scene complexity, so it is difficult to obtain a satisfactory solution. The augmented reality (AR) method [1, 2] can overcome these limitations. AR is a technology that incorporates the rich information available in the real world into the virtual environment: 3D graphical objects are overlaid on video images without pre-calibrated camera parameters. Because AR is a video-image approach, only the few virtual objects to be fused into the real environment need to be drawn, so the rendering cost is independent of the real scene's complexity and no specialized graphics accelerator is required. On the other hand, the realism of an AR system depends on the quality of the input images, which makes it easy for AR systems to appear more realistic than many VR systems.
In the following, the main idea of vision-based AR for environmental planning is presented.
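As a minimal illustration of the video-overlay idea (a sketch, not the chapter's own implementation), the code below projects a virtual 3D point into a camera image using an assumed pinhole projection; in a vision-based AR system the intrinsics and pose would be estimated from image features rather than fixed in advance:

```python
import numpy as np

# Assumed pinhole camera: intrinsics K and pose [R | t].
# In a real AR system these would come from the vision front end;
# the values here are purely illustrative.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0,   0.0,   1.0]])
R = np.eye(3)                        # camera rotation
t = np.array([[0.0], [0.0], [5.0]])  # camera 5 m in front of the scene

def project(point_3d):
    """Project a 3D world point into pixel coordinates."""
    p = K @ (R @ point_3d.reshape(3, 1) + t)
    return (p[:2] / p[2]).ravel()    # perspective divide

# A virtual object vertex at the world origin lands at the image centre,
# where it would be composited over the corresponding video pixel.
u, v = project(np.array([0.0, 0.0, 0.0]))
print(u, v)  # 320.0 240.0
```

Note that only the virtual vertices pass through this projection; the real scene arrives for free in the video frame, which is why the rendering cost does not grow with the complexity of the real environment.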
References
Y. Ohta, Mixed Reality: Merging Real and Virtual Worlds, Tokyo: Ohmsha/Springer-Verlag, 1999.
R.T. Azuma, "A Survey of Augmented Reality," Presence: Teleoperators and Virtual Environments, vol. 6, no. 4, 1997, pp. 355–385.
D. Lowe, "Distinctive Image Features from Scale-Invariant Keypoints," Int'l J. Computer Vision, vol. 60, no. 2, November 2004, pp. 91–110.
M. Fischler and R. Bolles, "Random Sample Consensus: A Paradigm for Model Fitting with Applications to Image Analysis and Automated Cartography," CACM, vol. 24, no. 6, June 1981, pp. 381–395.
J. Park and A.C. Kak, "3D Modeling of Optically Challenging Objects," IEEE Trans. on Visualization and Computer Graphics, vol. 14, no. 2, March 2008, pp. 246–262.
R. Behringer, J. Park, and V. Sundareswaran, "Model-Based Visual Tracking for Outdoor Augmented Reality," Proc. Int'l Symp. on Mixed and Augmented Reality, Germany, October 2002, p. 277.
T. Drummond and R. Cipolla, "Real-Time Visual Tracking of Complex Structures," IEEE Trans. on Pattern Analysis and Machine Intelligence, vol. 24, no. 7, July 2002, pp. 932–946.
R. Hartley and A. Zisserman, Multiple View Geometry in Computer Vision, 2nd ed., Cambridge University Press, 2004.
Zhu Miao-liang, Yao Yuan, and Jiang Yun-liang, "A Survey on Augmented Reality," Chinese Journal of Image and Graphics, vol. 9, no. 7, July 2004, pp. 767–774.
Shi Qi, Wang Yong-tian, and Cheng Jing, "Vision-Based Algorithm for Augmented Reality Registration," Journal of Image and Graphics, vol. 7, no. 7, July 2002, pp. 679–683.
M.S. Shahidan, N. Ibrahim, M.H.M. Zabil, and A. Yusof, "An Implementation Review of Occlusion-Based Interaction in Augmented Reality Environment," 2nd International Conference on Computing & Informatics, Kuala Lumpur, Malaysia, June 2009, pp. 24–25.
M. Wloka and B. Anderson, "Resolving Occlusions in Augmented Reality," Proc. Symposium on Interactive 3D Graphics, New York, August 1995, pp. 5–12.
Jie Shen and Haowu Liu, "Solving Occlusion Problem in Augmented Reality," Chinese Journal of UEST, vol. 30, no. 3, March 2001, pp. 236–240.
H. Wang, K. Sengupta, P. Kumar, and R. Sharma, "Occlusion Handling in Augmented Reality Using Background-Foreground Segmentation and Projective Geometry," Presence: Teleoperators & Virtual Environments, vol. 14, no. 3, June 2005, pp. 264–277.
Zhu Jiejie and Pan Zhigen, "Computer Vision Based Occlusion Handling Algorithm for Video-Based Augmented Reality," Chinese Journal of Computer-Aided Design & Computer Graphics, vol. 19, no. 12, December 2007, pp. 1624–1628.
J.J. Koenderink and A.J. van Doorn, "Affine Structure from Motion," J. Opt. Soc. Am. A, vol. 8, no. 2, February 1991, pp. 377–385.
M. Seitz and C.R. Dyer, "Complete Scene Structure from Four Point Correspondences," Proc. Fifth Int'l Conf. on Computer Vision, Cambridge, MA, June 1995, pp. 330–337.
R.I. Hartley, "In Defence of the 8-Point Algorithm," Proc. Fifth International Conference on Computer Vision, Cambridge, Massachusetts, USA, June 1995, pp. 1064–1070.
T. Liu, A.W. Moore, A. Gray, and K. Yang, "An Investigation of Practical Approximate Nearest Neighbor Algorithms," Advances in Neural Information Processing Systems, Vancouver, BC, Canada, 2005, pp. 825–832.
P.H.S. Torr and D.W. Murray, "The Development and Comparison of Robust Methods for Estimating the Fundamental Matrix," International Journal of Computer Vision, vol. 24, no. 3, October 1997, pp. 271–300.
Ā© 2011 Springer Science+Business Media, LLC
Cite this chapter
Shen, J. (2011). Environmental Planning Using Augmented Reality. In: Furht, B. (eds) Handbook of Augmented Reality. Springer, New York, NY. https://doi.org/10.1007/978-1-4614-0064-6_22
Print ISBN: 978-1-4614-0063-9
Online ISBN: 978-1-4614-0064-6
eBook Packages: Computer Science (R0)