PRTGaussian: Efficient Relighting Using 3D Gaussians with Precomputed Radiance Transfer
- URL: http://arxiv.org/abs/2408.05631v1
- Date: Sat, 10 Aug 2024 20:57:38 GMT
- Title: PRTGaussian: Efficient Relighting Using 3D Gaussians with Precomputed Radiance Transfer
- Authors: Libo Zhang, Yuxuan Han, Wenbin Lin, Jingwang Ling, Feng Xu
- Abstract summary: PRTGaussian is a real-time relightable novel-view synthesis method.
By fitting relightable Gaussians to multi-view OLAT data, our method enables real-time, free-viewpoint relighting.
- Score: 13.869132334647771
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We present PRTGaussian, a real-time relightable novel-view synthesis method made possible by combining 3D Gaussians and Precomputed Radiance Transfer (PRT). By fitting relightable Gaussians to multi-view OLAT data, our method enables real-time, free-viewpoint relighting. By estimating the radiance transfer based on high-order spherical harmonics, we achieve a balance between capturing detailed relighting effects and maintaining computational efficiency. We utilize a two-stage process: in the first stage, we reconstruct a coarse geometry of the object from multi-view images. In the second stage, we initialize 3D Gaussians with the obtained point cloud, then simultaneously refine the coarse geometry and learn the light transport for each Gaussian. Extensive experiments on synthetic datasets show that our approach can achieve fast and high-quality relighting for general objects. Code and data are available at https://github.com/zhanglbthu/PRTGaussian.
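The core PRT operation the abstract describes is simple at evaluation time: once each Gaussian stores a vector of spherical-harmonic transfer coefficients, relighting reduces to an inner product between those coefficients and the environment light projected onto the same SH basis. Below is a minimal sketch of that step; the array shapes, function names, and the use of a random toy light are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def sh_coeff_count(order):
    # Number of SH coefficients up to band l_max is (l_max + 1)^2.
    return (order + 1) ** 2

def relight(transfer, light_sh):
    # transfer:  (N, K, 3) per-Gaussian RGB transfer coefficients
    # light_sh:  (K, 3)    environment light projected onto the SH basis
    # Relit radiance per Gaussian is the inner product over the K basis
    # functions, kept separate per color channel.
    return np.einsum('nkc,kc->nc', transfer, light_sh)

# Toy usage: order-5 SH (36 coefficients) for 1000 Gaussians.
K = sh_coeff_count(5)
transfer = np.random.rand(1000, K, 3)
light_sh = np.random.rand(K, 3)
radiance = relight(transfer, light_sh)  # shape (1000, 3)
```

Because the light only enters through its SH projection, swapping environments at render time costs one small dot product per Gaussian, which is what makes the method real-time.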
Related papers
- GPS-Gaussian+: Generalizable Pixel-wise 3D Gaussian Splatting for Real-Time Human-Scene Rendering from Sparse Views [67.34073368933814]
We propose a generalizable Gaussian Splatting approach for high-resolution image rendering under a sparse-view camera setting.
We train our Gaussian parameter regression module on human-only data or human-scene data, jointly with a depth estimation module to lift 2D parameter maps to 3D space.
Experiments on several datasets demonstrate that our method outperforms state-of-the-art methods while achieving a much faster rendering speed.
arXiv Detail & Related papers (2024-11-18T08:18:44Z) - L3DG: Latent 3D Gaussian Diffusion [74.36431175937285]
L3DG is the first approach for generative 3D modeling of 3D Gaussians through a latent 3D Gaussian diffusion formulation.
We employ a sparse convolutional architecture to efficiently operate on room-scale scenes.
By leveraging the 3D Gaussian representation, the generated scenes can be rendered from arbitrary viewpoints in real-time.
arXiv Detail & Related papers (2024-10-17T13:19:32Z) - BiGS: Bidirectional Gaussian Primitives for Relightable 3D Gaussian Splatting [10.918133974256913]
We present Bidirectional Gaussian Primitives, an image-based novel view synthesis technique.
Our approach integrates light intrinsic decomposition into the Gaussian splatting framework, enabling real-time relighting of 3D objects.
arXiv Detail & Related papers (2024-08-23T21:04:40Z) - Subsurface Scattering for 3D Gaussian Splatting [10.990813043493642]
3D reconstruction and relighting of objects made from scattering materials present a significant challenge due to the complex light transport beneath the surface.
We propose a framework for optimizing an object's shape together with the radiance transfer field given multi-view OLAT (one light at a time) data.
Our approach enables material editing, relighting and novel view synthesis at interactive rates.
arXiv Detail & Related papers (2024-08-22T10:34:01Z) - RTG-SLAM: Real-time 3D Reconstruction at Scale using Gaussian Splatting [51.51310922527121]
We present a real-time 3D reconstruction system with an RGBD camera for large-scale environments using Gaussian splatting.
We force each Gaussian to be either opaque or nearly transparent, with the opaque ones fitting the surface and dominant colors, and transparent ones fitting residual colors.
We show real-time reconstructions of a variety of large scenes and demonstrate superior performance in the realism of novel view synthesis and camera tracking accuracy.
arXiv Detail & Related papers (2024-04-30T16:54:59Z) - DeferredGS: Decoupled and Editable Gaussian Splatting with Deferred Shading [50.331929164207324]
We introduce DeferredGS, a method for decoupling and editing the Gaussian splatting representation using deferred shading.
Both qualitative and quantitative experiments demonstrate the superior performance of DeferredGS in novel view and editing tasks.
arXiv Detail & Related papers (2024-04-15T01:58:54Z) - Sparse-view CT Reconstruction with 3D Gaussian Volumetric Representation [13.667470059238607]
Sparse-view CT is a promising strategy for reducing the radiation dose of traditional CT scans.
Recently, 3D Gaussians have been applied to model complex natural scenes.
We investigate their potential for sparse-view CT reconstruction.
arXiv Detail & Related papers (2023-12-25T09:47:33Z) - pixelSplat: 3D Gaussian Splats from Image Pairs for Scalable Generalizable 3D Reconstruction [26.72289913260324]
pixelSplat is a feed-forward model that learns to reconstruct 3D radiance fields parameterized by 3D Gaussian primitives from pairs of images.
Our model features real-time and memory-efficient rendering for scalable training as well as fast 3D reconstruction at inference time.
arXiv Detail & Related papers (2023-12-19T17:03:50Z) - GIR: 3D Gaussian Inverse Rendering for Relightable Scene Factorization [62.13932669494098]
This paper presents a 3D Gaussian Inverse Rendering (GIR) method, employing 3D Gaussian representations to factorize the scene into material properties, light, and geometry.
We compute the normal of each 3D Gaussian using the shortest eigenvector, with a directional masking scheme forcing accurate normal estimation without external supervision.
We adopt an efficient voxel-based indirect illumination tracing scheme that stores direction-aware outgoing radiance in each 3D Gaussian to disentangle secondary illumination for approximating multi-bounce light transport.
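The normal-estimation step GIR describes is a standard piece of linear algebra: a flattened 3D Gaussian's shortest axis is the eigenvector of its covariance with the smallest eigenvalue, and a view-dependent sign flip resolves the eigenvector's inherent ambiguity. A minimal sketch, with the function name and the view-direction convention (camera-to-point) as assumptions:

```python
import numpy as np

def gaussian_normal(covariance, view_dir):
    # eigh returns eigenvalues in ascending order, so column 0 of the
    # eigenvector matrix is the shortest axis of the Gaussian.
    eigvals, eigvecs = np.linalg.eigh(covariance)
    n = eigvecs[:, 0]
    # Eigenvectors are defined only up to sign; flip so the normal
    # faces the camera (opposes the camera-to-point view direction).
    if np.dot(n, view_dir) > 0:
        n = -n
    return n

# Toy usage: a pancake-shaped Gaussian squashed along z,
# seen by a camera looking down the -z axis.
cov = np.diag([1.0, 1.0, 0.01])
normal = gaussian_normal(cov, view_dir=np.array([0.0, 0.0, -1.0]))
```

For this covariance the shortest axis is the z axis, so the recovered normal is (0, 0, 1) up to the camera-facing sign flip.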
arXiv Detail & Related papers (2023-12-08T16:05:15Z) - GPS-Gaussian: Generalizable Pixel-wise 3D Gaussian Splatting for Real-time Human Novel View Synthesis [70.24111297192057]
We present a new approach, termed GPS-Gaussian, for synthesizing novel views of a character in a real-time manner.
The proposed method enables 2K-resolution rendering under a sparse-view camera setting.
arXiv Detail & Related papers (2023-12-04T18:59:55Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.