PRTGS: Precomputed Radiance Transfer of Gaussian Splats for Real-Time High-Quality Relighting
- URL: http://arxiv.org/abs/2408.03538v1
- Date: Wed, 7 Aug 2024 04:35:06 GMT
- Title: PRTGS: Precomputed Radiance Transfer of Gaussian Splats for Real-Time High-Quality Relighting
- Authors: Yijia Guo, Yuanxi Bai, Liwen Hu, Ziyi Guo, Mianzhi Liu, Yu Cai, Tiejun Huang, Lei Ma
- Abstract summary: We propose a real-time high-quality relighting method for Gaussian splats in low-frequency lighting environments.
This method captures soft shadows and interreflections by precomputing 3D Gaussian splats' radiance transfer.
We introduce distinct precomputing methods tailored for training and rendering stages, along with unique ray tracing and indirect lighting precomputation techniques.
- Score: 24.48524333222271
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose Precomputed Radiance Transfer of Gaussian Splats (PRTGS), a real-time high-quality relighting method for Gaussian splats in low-frequency lighting environments that captures soft shadows and interreflections by precomputing the radiance transfer of 3D Gaussian splats. Existing studies have demonstrated that 3D Gaussian splatting (3DGS) outperforms neural fields in efficiency for dynamic lighting scenarios. However, current relighting methods based on 3DGS still struggle to compute high-quality shadows and indirect illumination in real time for dynamic light, leading to unrealistic rendering results. We solve this problem by precomputing the expensive transport simulations required for complex transfer functions like shadowing; the resulting transfer functions are represented as dense sets of vectors or matrices for every Gaussian splat. We introduce distinct precomputing methods tailored for the training and rendering stages, along with unique ray tracing and indirect lighting precomputation techniques for 3D Gaussian splats, to accelerate training and compute accurate indirect lighting with respect to the environment light. Experimental analyses demonstrate that our approach achieves state-of-the-art visual quality while maintaining competitive training times, and allows high-quality real-time (30+ fps) relighting for dynamic light and relatively complex scenes at 1080p resolution.
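To make the precomputation idea concrete, below is a minimal sketch (not the paper's implementation) of how diffuse precomputed radiance transfer is typically evaluated at render time, assuming each Gaussian splat stores an order-2 spherical-harmonic transfer vector that bakes in cosine-weighted visibility; the array shapes and function name are illustrative assumptions.

```python
import numpy as np

# Minimal sketch of diffuse precomputed radiance transfer (PRT) evaluation.
# Assumption: each Gaussian splat stores an order-2 spherical-harmonic (SH)
# transfer vector (9 coefficients) that already bakes in cosine-weighted
# visibility (soft shadows). Shapes and names are illustrative, not the
# paper's API.

SH_COEFFS = 9  # order-2 SH basis (l = 0, 1, 2)

def relight_diffuse(transfer, light_sh, albedo):
    """Evaluate outgoing radiance for every splat under a new environment light.

    transfer : (N, SH_COEFFS)  precomputed per-splat transfer vectors
    light_sh : (SH_COEFFS, 3)  environment map projected onto the SH basis (RGB)
    albedo   : (N, 3)          per-splat diffuse albedo
    """
    # Under low-frequency lighting, the shading integral collapses to a dot
    # product between the light and transfer coefficients in the SH basis.
    irradiance = transfer @ light_sh           # (N, 3)
    return albedo * np.clip(irradiance, 0.0, None)
```

Because visibility is baked into the transfer vectors at precompute time, relighting under a new or dynamic environment light only requires re-projecting that light onto the SH basis and repeating the per-splat dot product, which is what makes real-time evaluation feasible.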
Related papers
- GI-GS: Global Illumination Decomposition on Gaussian Splatting for Inverse Rendering [6.820642721852439]
We present GI-GS, a novel inverse rendering framework that leverages 3D Gaussian Splatting (3DGS) and deferred shading.
In our framework, we first render a G-buffer to capture the detailed geometry and material properties of the scene.
With the G-buffer and previous rendering results, indirect lighting can then be calculated through lightweight path tracing.
arXiv Detail & Related papers (2024-10-03T15:58:18Z) - Subsurface Scattering for 3D Gaussian Splatting [10.990813043493642]
3D reconstruction and relighting of objects made from scattering materials present a significant challenge due to the complex light transport beneath the surface.
We propose a framework for optimizing an object's shape together with the radiance transfer field given multi-view OLAT (one light at a time) data.
Our approach enables material editing, relighting and novel view synthesis at interactive rates.
arXiv Detail & Related papers (2024-08-22T10:34:01Z) - PRTGaussian: Efficient Relighting Using 3D Gaussians with Precomputed Radiance Transfer [13.869132334647771]
PRTGaussian is a real-time relightable novel-view synthesis method.
By fitting relightable Gaussians to multi-view OLAT data, our method enables real-time, free-viewpoint relighting.
arXiv Detail & Related papers (2024-08-10T20:57:38Z) - 3iGS: Factorised Tensorial Illumination for 3D Gaussian Splatting [15.059156311856087]
Factorised Tensorial Illumination for 3D Gaussian Splatting, or 3iGS, improves upon the rendering quality of 3D Gaussian Splatting (3DGS).
The use of 3D Gaussians as a representation of radiance fields has enabled high-quality novel view synthesis at real-time rendering speeds.
arXiv Detail & Related papers (2024-08-07T13:06:29Z) - DeferredGS: Decoupled and Editable Gaussian Splatting with Deferred Shading [50.331929164207324]
We introduce DeferredGS, a method for decoupling and editing the Gaussian splatting representation using deferred shading.
Both qualitative and quantitative experiments demonstrate the superior performance of DeferredGS in novel view synthesis and editing tasks.
arXiv Detail & Related papers (2024-04-15T01:58:54Z) - CVT-xRF: Contrastive In-Voxel Transformer for 3D Consistent Radiance Fields from Sparse Inputs [65.80187860906115]
We propose a novel approach to improve NeRF's performance with sparse inputs.
We first adopt a voxel-based ray sampling strategy to ensure that the sampled rays intersect with a certain voxel in 3D space.
We then randomly sample additional points within the voxel and apply a Transformer to infer the properties of other points on each ray, which are then incorporated into the volume rendering.
arXiv Detail & Related papers (2024-03-25T15:56:17Z) - GIR: 3D Gaussian Inverse Rendering for Relightable Scene Factorization [62.13932669494098]
This paper presents a 3D Gaussian Inverse Rendering (GIR) method, employing 3D Gaussian representations to factorize the scene into material properties, light, and geometry.
We compute the normal of each 3D Gaussian using the shortest eigenvector, with a directional masking scheme forcing accurate normal estimation without external supervision.
We adopt an efficient voxel-based indirect illumination tracing scheme that stores direction-aware outgoing radiance in each 3D Gaussian to disentangle secondary illumination for approximating multi-bounce light transport.
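As an illustration of the normal estimate described above, the following sketch (under assumptions, not GIR's actual code) takes the normal of a Gaussian to be the eigenvector of its 3x3 covariance with the smallest eigenvalue, i.e. the splat's shortest axis; the view-dependent flip is a simple stand-in for the directional masking scheme the summary mentions.

```python
import numpy as np

# Illustrative sketch: estimate a 3D Gaussian's normal from its covariance.
def gaussian_normal(cov, view_dir):
    """cov: (3, 3) covariance of one Gaussian; view_dir: (3,) unit vector from camera to splat."""
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues returned in ascending order
    normal = eigvecs[:, 0]                  # axis with the smallest extent (shortest eigenvector)
    if np.dot(normal, view_dir) > 0.0:      # orient the normal toward the camera
        normal = -normal
    return normal
```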
arXiv Detail & Related papers (2023-12-08T16:05:15Z) - Scaffold-GS: Structured 3D Gaussians for View-Adaptive Rendering [71.44349029439944]
The recent 3D Gaussian Splatting method has achieved state-of-the-art rendering quality and speed.
We introduce Scaffold-GS, which uses anchor points to distribute local 3D Gaussians.
We show that our method effectively reduces redundant Gaussians while delivering high-quality rendering.
arXiv Detail & Related papers (2023-11-30T17:58:57Z) - GS-IR: 3D Gaussian Splatting for Inverse Rendering [71.14234327414086]
We propose GS-IR, a novel inverse rendering approach based on 3D Gaussian Splatting (GS).
We extend GS, a top-performance representation for novel view synthesis, to estimate scene geometry, surface material, and environment illumination from multi-view images captured under unknown lighting conditions.
The flexible and expressive GS representation allows us to achieve fast and compact geometry reconstruction, photorealistic novel view synthesis, and effective physically-based rendering.
arXiv Detail & Related papers (2023-11-26T02:35:09Z) - Spatiotemporally Consistent HDR Indoor Lighting Estimation [66.26786775252592]
We propose a physically-motivated deep learning framework to solve the indoor lighting estimation problem.
Given a single LDR image with a depth map, our method predicts spatially consistent lighting at any given image position.
Our framework achieves photorealistic lighting prediction with higher quality compared to state-of-the-art single-image or video-based methods.
arXiv Detail & Related papers (2023-05-07T20:36:29Z)