GLEAM: An Illumination Estimation Framework for Real-time Photorealistic Augmented Reality on Mobile Devices

Published: 12 June 2019

Abstract

Mixed reality mobile platforms co-locate virtual scenes with physical environments to create immersive user experiences. However, to create visual harmony between virtual and physical spaces, the virtual scene must be accurately illuminated with realistic lighting that matches the physical environment. To this end, we design GLEAM, a framework that provides robust illumination estimation in real time by integrating physical light-probe estimation with current mobile AR systems. GLEAM visually observes reflective objects to compose a realistic estimate of physical lighting. Optionally, GLEAM can network multiple devices to sense illumination from different viewpoints and compose a richer estimate that enhances realism and fidelity. Using GLEAM, AR developers gain the freedom to use a wide range of materials that require accurate illumination to appear realistic, such as liquids, glass, and smooth metals, and that current solutions render unconvincingly. Our controlled-environment user studies across 30 participants reveal the effectiveness of GLEAM in providing robust and adaptive illumination estimation over commercial status quo solutions, such as pre-baked directional lighting and ARKit 2.0 illumination estimation. Our benchmarks reveal the need for situation-driven tradeoffs between estimation quality and freshness: GLEAM can update scene illumination as fast as every 30 ms by sacrificing richness and fidelity in highly dynamic scenes, or prioritize quality by allowing an update interval as long as 400 ms in scenes that require high-fidelity estimation.
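The light-probe approach the abstract describes — observing a reflective object and composing an environment map from its reflections — follows the classic mirror-ball construction used in image-based lighting. A minimal sketch of that construction (not the paper's implementation; the function name, resolutions, and the orthographic-camera assumption are illustrative):

```python
import numpy as np

def mirror_ball_to_equirect(ball_img, out_h=64, out_w=128):
    """Project a square image of a mirrored sphere into an
    equirectangular environment map (a light probe).

    ball_img: (N, N, 3) float array; pixels outside the inscribed
    circle of the sphere are ignored. Assumes an orthographic
    camera looking down -z at the ball.
    """
    n = ball_img.shape[0]
    # Normalized coordinates in [-1, 1] over the ball image.
    ys, xs = np.mgrid[0:n, 0:n]
    u = (xs + 0.5) / n * 2.0 - 1.0
    v = 1.0 - (ys + 0.5) / n * 2.0
    r2 = u * u + v * v
    inside = r2 <= 1.0
    # Sphere surface normal at each pixel.
    nz = np.sqrt(np.clip(1.0 - r2, 0.0, 1.0))
    # Reflect the view ray d = (0, 0, -1) about the normal:
    # r = d - 2 (d . n) n  =>  r = (2 nz u, 2 nz v, 2 nz^2 - 1)
    refl = np.stack([2 * nz * u, 2 * nz * v, 2 * nz * nz - 1.0], axis=-1)
    # Convert reflected directions to equirectangular bins
    # (polar angle from +y, azimuth around it).
    theta = np.arccos(np.clip(refl[..., 1], -1.0, 1.0))
    phi = np.arctan2(refl[..., 0], refl[..., 2])
    row = np.clip((theta / np.pi * out_h).astype(int), 0, out_h - 1)
    col = np.clip(((phi + np.pi) / (2 * np.pi) * out_w).astype(int), 0, out_w - 1)
    # Splat valid samples; average where several pixels hit one bin.
    env = np.zeros((out_h, out_w, 3))
    count = np.zeros((out_h, out_w, 1))
    np.add.at(env, (row[inside], col[inside]), ball_img[inside])
    np.add.at(count, (row[inside], col[inside]), 1.0)
    return env / np.maximum(count, 1.0)
```

A single mirror-ball view only covers part of the sphere of directions, which is why networking multiple viewpoints, as GLEAM optionally does, yields a richer estimate: each device fills in bins the others cannot see.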


Published In

MobiSys '19: Proceedings of the 17th Annual International Conference on Mobile Systems, Applications, and Services
June 2019
736 pages
ISBN:9781450366618
DOI:10.1145/3307334


Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. augmented reality
  2. geometry
  3. image processing
  4. image-based lighting
  5. light estimation
  6. light probe
  7. lighting models

Qualifiers

  • Research-article

Conference

MobiSys '19

Acceptance Rates

Overall Acceptance Rate 274 of 1,679 submissions, 16%


Cited By

  • (2024) "Real-Time Lighting Effects for Consumer-Grade Mobile Graphics Hardware." IEEE Transactions on Consumer Electronics 70(1), 338-349. DOI: 10.1109/TCE.2023.3328051. Mar 2024.
  • (2024) "HarvAR: Mobile Augmented-Reality-Assisted Photovoltaic Energy-Harvesting Sensor Management." IEEE Internet of Things Journal 11(17), 28591-28604. DOI: 10.1109/JIOT.2024.3402168. 1 Sep 2024.
  • (2023) "Multi-Camera Lighting Estimation for Photorealistic Front-Facing Mobile Augmented Reality." Proceedings of the 24th International Workshop on Mobile Computing Systems and Applications, 68-73. DOI: 10.1145/3572864.3580337. 22 Feb 2023.
  • (2023) "MobiSpectral: Hyperspectral Imaging on Mobile Devices." Proceedings of the 29th Annual International Conference on Mobile Computing and Networking, 1-15. DOI: 10.1145/3570361.3613296. 2 Oct 2023.
  • (2023) "U-DiVE: Design and Evaluation of a Distributed Photorealistic Virtual Reality Environment." Multimedia Tools and Applications 82(22), 34129-34145. DOI: 10.1007/s11042-023-15064-y. 27 Mar 2023.
  • (2023) "LiteAR: A Framework to Estimate Lighting for Mixed Reality Sessions for Enhanced Realism." Advances in Computer Graphics, 407-423. DOI: 10.1007/978-3-031-23473-6_32. 1 Jan 2023.
  • (2021) "LensCap." Proceedings of the 19th Annual International Conference on Mobile Systems, Applications, and Services, 14-27. DOI: 10.1145/3458864.3467676. 24 Jun 2021.
  • (2021) "Light4AR: a Shadow-based Estimator of Multiple Light Sources in Interactive Time for More Photorealistic AR Experiences." 2021 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct), 304-309. DOI: 10.1109/ISMAR-Adjunct54149.2021.00069. Oct 2021.
  • (2021) "3D location estimation of light sources in room-scale scenes." 2021 International Conference on 3D Immersion (IC3D), 1-8. DOI: 10.1109/IC3D53758.2021.9687218. 8 Dec 2021.
  • (2020) "ARCHIE: A User-Focused Framework for Testing Augmented Reality Applications in the Wild." 2020 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), 903-912. Mar 2020.
