
Learning Spherical Radiance Field for Efficient 360° Unbounded Novel View Synthesis


Abstract:

Novel view synthesis aims to render images from arbitrary camera poses given sparse observations of a scene. Recently, neural radiance fields (NeRF) have demonstrated their effectiveness in synthesizing novel views of a bounded scene. However, most existing methods cannot be directly extended to 360° unbounded scenes, where camera orientations and scene depths are unconstrained and vary widely. In this paper, we present a spherical radiance field (SRF) for efficient novel view synthesis in 360° unbounded scenes. Specifically, we represent a 3D scene as multiple concentric spheres with different radii, where each sphere encodes its corresponding scene layer as an implicit representation parameterized by an equirectangular projection image. A shallow multi-layer perceptron (MLP) then infers density and color from these sphere representations for volume rendering. Moreover, an occupancy grid is introduced to cache the density field and guide ray sampling, which accelerates training and rendering by reducing the number of samples along each ray. Experiments show that our method fits 360° unbounded scenes well and produces state-of-the-art results on three benchmark datasets with less than 30 minutes of training time on an RTX 3090 GPU, surpassing Mip-NeRF 360 with a 400× speedup. In addition, our method achieves competitive performance in terms of both accuracy and efficiency on a bounded dataset. Project page: https://minglin-chen.github.io/SphericalRF
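
To make the representation concrete, here is a minimal PyTorch sketch of the core idea as the abstract describes it: a 3D sample point is assigned to concentric spheres by its radius, per-sphere equirectangular feature images are bilinearly sampled at the point's direction, and a shallow MLP decodes the interpolated feature into density and color. The hyperparameters (sphere count, feature width, inverse-depth radius schedule) and the linear blending between the two bracketing spheres are illustrative assumptions, not the paper's actual settings.

```python
import math

import torch
import torch.nn as nn
import torch.nn.functional as F


class SphericalRF(nn.Module):
    """Sketch of a spherical radiance field: concentric spheres carry
    learnable equirectangular feature images; a shallow MLP decodes
    interpolated features into density and colour."""

    def __init__(self, num_spheres=16, feat_dim=8, res=(256, 512),
                 r_min=1.0, r_max=64.0):
        super().__init__()
        h, w = res
        # One learnable equirectangular feature image per concentric sphere.
        self.feats = nn.Parameter(0.01 * torch.randn(num_spheres, feat_dim, h, w))
        # Radii spaced uniformly in inverse depth so outer spheres cover the
        # distant background (an assumed schedule, not the paper's).
        inv = torch.linspace(1.0 / r_min, 1.0 / r_max, num_spheres)
        self.register_buffer("radii", inv.reciprocal())  # ascending radii
        # Shallow MLP decoding a blended feature into (sigma, rgb).
        self.mlp = nn.Sequential(nn.Linear(feat_dim, 64), nn.ReLU(inplace=True),
                                 nn.Linear(64, 4))

    def forward(self, xyz):
        # xyz: (N, 3) sample points along camera rays.
        r = xyz.norm(dim=-1).clamp_min(1e-6)                 # (N,)
        d = xyz / r[:, None]                                 # unit directions
        # Equirectangular projection of direction to (u, v) in [-1, 1].
        u = torch.atan2(d[:, 1], d[:, 0]) / math.pi          # longitude
        v = torch.asin(d[:, 2].clamp(-1.0, 1.0)) / (math.pi / 2)  # latitude
        grid = torch.stack([u, v], dim=-1).view(1, 1, -1, 2)
        S = self.radii.numel()
        # Sample every sphere image at (u, v): (S, C, 1, N) -> (N, S, C).
        f = F.grid_sample(self.feats, grid.expand(S, -1, -1, -1),
                          align_corners=True, padding_mode="border")
        f = f.squeeze(2).permute(2, 0, 1)
        # Linearly blend the two spheres bracketing each sample's radius.
        rc = r.clamp(self.radii[0], self.radii[-1])
        idx = torch.searchsorted(self.radii, rc).clamp(1, S - 1)
        r0, r1 = self.radii[idx - 1], self.radii[idx]
        w1 = ((rc - r0) / (r1 - r0)).unsqueeze(-1)

        def gather(i):  # pick the i-th sphere's feature for each point
            return f.gather(1, i[:, None, None].expand(-1, 1, f.shape[-1])).squeeze(1)

        feat = (1.0 - w1) * gather(idx - 1) + w1 * gather(idx)
        sigma, rgb = self.mlp(feat).split([1, 3], dim=-1)
        return F.softplus(sigma), torch.sigmoid(rgb)  # density >= 0, colour in [0, 1]
```

Such a module would be queried at the sample points a ray marcher produces, e.g. `sigma, rgb = SphericalRF()(pts)` for `pts` of shape `(N, 3)`, with the outputs fed into the standard volume-rendering integral.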
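The occupancy-grid acceleration can be sketched in the same spirit: a cached binary grid marks which cells of a bounding volume currently contain density, and candidate samples along each ray are kept only if they land in occupied cells, so the MLP is evaluated far less often. The uniform candidate spacing, the grid resolution, and the names `occ` and `aabb` below are illustrative assumptions; the paper's sampler may traverse the grid differently.

```python
import torch


@torch.no_grad()
def occupancy_filtered_samples(rays_o, rays_d, occ, aabb,
                               t_near=0.05, t_far=100.0, n_candidates=512):
    """Keep only ray samples that fall into occupied grid cells.

    rays_o, rays_d: (R, 3) ray origins and (unit) directions.
    occ:            (G, G, G) bool occupancy grid cached from the density field.
    aabb:           (min_xyz, max_xyz) tensors bounding the gridded volume.
    Returns the surviving sample points and their ray depths.
    """
    t = torch.linspace(t_near, t_far, n_candidates, device=rays_o.device)
    pts = rays_o[:, None, :] + t[None, :, None] * rays_d[:, None, :]  # (R, T, 3)
    lo, hi = aabb
    # Normalised grid coordinates in [0, 1); out-of-box points are dropped.
    g = (pts - lo) / (hi - lo)
    inside = ((g >= 0) & (g < 1)).all(dim=-1)                         # (R, T)
    G = occ.shape[0]
    idx = (g.clamp(0, 1 - 1e-6) * G).long()                           # cell indices
    occupied = occ[idx[..., 0], idx[..., 1], idx[..., 2]] & inside
    return pts[occupied], t.expand_as(occupied)[occupied]
```

In a training loop the grid would be refreshed periodically from the current density field (e.g. by evaluating the MLP at cell centers and thresholding), which is what lets it serve as a cheap cache that prunes empty space before any network query.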
Published in: IEEE Transactions on Image Processing (Volume: 33)
Page(s): 3722-3734
Date of Publication: 10 June 2024

PubMed ID: 38857135

