Abstract
Integrating a virtual object into the real world in a perceptually coherent manner, using the physical illumination of the current environment, remains an open problem. Several researchers have investigated it and produced high-quality results; however, their systems relied on pre-computation and the offline availability of resources. In this paper, we propose a novel and robust approach that identifies the incident light in the scene using the polarization properties of the light wave, and uses this information to produce a visually coherent augmented reality in a dynamic environment. The approach is part of a complete system with three components that run simultaneously in real time: (i) detection of the incident light angle, (ii) estimation of the reflected light, and (iii) creation of the shading properties required to render any virtual object with the detected lighting, cast shadows, and adequate materials. Finally, we analyze the system's performance and show that our approach reduces the overall computational cost.
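The abstract does not detail how the incident light angle is recovered from polarization; a minimal sketch of the standard three-angle Stokes-parameter estimation that polarization-based light detection typically builds on is shown below. The function name and the assumption of intensity measurements through a linear polarizer at 0°, 45°, and 90° are illustrative, not the paper's exact pipeline.

```python
import math

def polarization_params(i0, i45, i90):
    """Estimate the linear Stokes parameters from intensities measured
    through a linear polarizer oriented at 0, 45, and 90 degrees, and
    derive the angle and degree of linear polarization."""
    s0 = i0 + i90                          # total intensity
    s1 = i0 - i90                          # 0° vs. 90° preference
    s2 = 2.0 * i45 - s0                    # +45° vs. -45° preference
    aop = 0.5 * math.atan2(s2, s1)         # angle of polarization (radians)
    dop = math.sqrt(s1**2 + s2**2) / s0    # degree of linear polarization
    return aop, dop

# Example: fully linearly polarized light at 30 degrees.
# Through a polarizer at angle phi, intensity follows cos^2(phi - 30°).
theta = math.radians(30.0)
i = lambda phi: math.cos(math.radians(phi) - theta) ** 2
aop, dop = polarization_params(i(0), i(45), i(90))
print(math.degrees(aop), dop)  # recovers 30.0 degrees, DoP 1.0
```

Per-pixel application of this estimate over a camera image is what allows the dominant incident light direction to be inferred in real time without probes or pre-computed environment maps.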
© 2019 Springer Nature Switzerland AG
Cite this paper
Alhakamy, A., Tuceryan, M. (2019). Polarization-Based Illumination Detection for Coherent Augmented Reality Scene Rendering in Dynamic Environments. In: Gavrilova, M., Chang, J., Thalmann, N., Hitzer, E., Ishikawa, H. (eds) Advances in Computer Graphics. CGI 2019. Lecture Notes in Computer Science(), vol 11542. Springer, Cham. https://doi.org/10.1007/978-3-030-22514-8_1
Print ISBN: 978-3-030-22513-1
Online ISBN: 978-3-030-22514-8