Abstract
Physically-based approaches are increasingly used across a wide range of computer graphics applications. Modern graphics engines can thereby provide realistic output using physically correct values instead of analytical approximations. Such engines apply the final lighting to a geometry buffer to reduce complexity. To use this approach for Mediated Reality applications, some changes are required in order to fuse the real and the virtual world. In this paper, we present an approach that focuses on extracting real-world environment information and storing it directly in the geometry buffer. To this end, we introduce a solution using spatial geometry to integrate the real world into the virtual environment. The approach runs in real time and allows for visual interaction between virtual and real-world objects. Moreover, manipulation of the real world is easily possible.
Copyright information
© 2017 Springer International Publishing AG
Cite this paper
Schwandt, T., Broll, W. (2017). Differential G-Buffer Rendering for Mediated Reality Applications. In: De Paolis, L., Bourdot, P., Mongelli, A. (eds) Augmented Reality, Virtual Reality, and Computer Graphics. AVR 2017. Lecture Notes in Computer Science(), vol 10325. Springer, Cham. https://doi.org/10.1007/978-3-319-60928-7_30
DOI: https://doi.org/10.1007/978-3-319-60928-7_30
Print ISBN: 978-3-319-60927-0
Online ISBN: 978-3-319-60928-7