RTR-GS: 3D Gaussian Splatting for Inverse Rendering with Radiance Transfer and Reflection
- URL: http://arxiv.org/abs/2507.07733v1
- Date: Thu, 10 Jul 2025 13:13:08 GMT
- Title: RTR-GS: 3D Gaussian Splatting for Inverse Rendering with Radiance Transfer and Reflection
- Authors: Yongyang Zhou, Fang-Lue Zhang, Zichen Wang, Lei Zhang
- Abstract summary: RTR-GS is a novel inverse rendering framework capable of robustly rendering objects with arbitrary reflectance properties, decomposing BRDF and lighting, and delivering credible relighting results. We show that our method enhances novel view synthesis, normal estimation, decomposition, and relighting while maintaining efficient training and inference.
- Score: 16.81533668816093
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: 3D Gaussian Splatting (3DGS) has demonstrated impressive capabilities in novel view synthesis. However, rendering reflective objects remains a significant challenge, particularly in inverse rendering and relighting. We introduce RTR-GS, a novel inverse rendering framework capable of robustly rendering objects with arbitrary reflectance properties, decomposing BRDF and lighting, and delivering credible relighting results. Given a collection of multi-view images, our method effectively recovers geometric structure through a hybrid rendering model that combines forward rendering for radiance transfer with deferred rendering for reflections. This approach successfully separates high-frequency and low-frequency appearances, mitigating floating artifacts caused by spherical harmonic overfitting when handling high-frequency details. We further refine BRDF and lighting decomposition using an additional physically-based deferred rendering branch. Experimental results show that our method enhances novel view synthesis, normal estimation, decomposition, and relighting while maintaining an efficient training and inference process.
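The abstract describes a hybrid pipeline: a forward (splatted) pass supplies the low-frequency radiance, and a deferred per-pixel pass adds the high-frequency reflection term. The sketch below is only a minimal illustration of that split, not the authors' implementation; all buffers, the reflect_weight mask, and the env_lookup function are hypothetical placeholders standing in for quantities a splatting renderer would produce.

```python
# Minimal sketch (not RTR-GS code): combine a forward-rendered low-frequency
# radiance buffer with a deferred environment-reflection term per pixel.
import numpy as np

H, W = 4, 4  # tiny image for illustration

# Hypothetical G-buffers that a splatting pass might produce per pixel.
low_freq_rgb = np.random.rand(H, W, 3)            # forward-rendered SH radiance
normals = np.random.randn(H, W, 3)
normals /= np.linalg.norm(normals, axis=-1, keepdims=True)
view_dirs = np.tile(np.array([0.0, 0.0, 1.0]), (H, W, 1))  # surface-to-camera
reflect_weight = np.random.rand(H, W, 1)          # e.g. a specular/Fresnel mask

def env_lookup(directions):
    """Placeholder environment radiance: a simple vertical gradient."""
    t = 0.5 * (directions[..., 1:2] + 1.0)        # map y in [-1, 1] to [0, 1]
    sky = np.array([0.6, 0.7, 0.9])
    ground = np.array([0.2, 0.15, 0.1])
    return t * sky + (1.0 - t) * ground

# Deferred reflection pass: mirror the incoming view ray about the normal,
# query the environment, and blend with the low-frequency forward result.
d = -view_dirs                                     # camera-to-surface direction
refl_dirs = d - 2.0 * np.sum(d * normals, axis=-1, keepdims=True) * normals
specular = env_lookup(refl_dirs)
final_rgb = low_freq_rgb + reflect_weight * specular

print(final_rgb.shape)  # (4, 4, 3)
```

Deferring the reflection lookup to per-pixel shading keeps the high-frequency specular term out of the spherical-harmonic forward branch, which is, roughly, the separation the abstract credits with reducing floating artifacts.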
Related papers
- Reflections Unlock: Geometry-Aware Reflection Disentanglement in 3D Gaussian Splatting for Photorealistic Scenes Rendering [51.223347330075576]
Ref-Unlock is a novel geometry-aware reflection modeling framework based on 3D Gaussian Splatting.
Our approach employs a dual-branch representation with high-order spherical harmonics to capture high-frequency reflective details.
Our method thus offers an efficient and generalizable solution for realistic rendering of reflective scenes.
arXiv Detail & Related papers (2025-07-08T15:45:08Z)
- RGS-DR: Reflective Gaussian Surfels with Deferred Rendering for Shiny Objects [40.7625935521925]
RGS-DR is a novel inverse rendering method for reconstructing and rendering glossy and reflective objects.
It supports flexible relighting and scene editing.
arXiv Detail & Related papers (2025-04-25T16:23:50Z)
- EnvGS: Modeling View-Dependent Appearance with Environment Gaussian [78.74634059559891]
EnvGS is a novel approach that employs a set of Gaussian primitives as an explicit 3D representation for capturing reflections of environments.
To efficiently render these environment Gaussian primitives, we developed a ray-tracing-based renderer that leverages the GPU's RT core for fast rendering.
Results from multiple real-world and synthetic datasets demonstrate that our method produces significantly more detailed reflections.
arXiv Detail & Related papers (2024-12-19T18:59:57Z)
- GUS-IR: Gaussian Splatting with Unified Shading for Inverse Rendering [83.69136534797686]
We present GUS-IR, a novel framework designed to address the inverse rendering problem for complicated scenes featuring rough and glossy surfaces.
This paper starts by analyzing and comparing two prominent shading techniques popularly used for inverse rendering, forward shading and deferred shading.
We propose a unified shading solution that combines the advantages of both techniques for better decomposition.
arXiv Detail & Related papers (2024-11-12T01:51:05Z)
- RelitLRM: Generative Relightable Radiance for Large Reconstruction Models [52.672706620003765]
We propose RelitLRM for generating high-quality Gaussian splatting representations of 3D objects under novel illuminations.
Unlike prior inverse rendering methods requiring dense captures and slow optimization, RelitLRM adopts a feed-forward transformer-based model.
We show our sparse-view feed-forward RelitLRM offers competitive relighting results to state-of-the-art dense-view optimization-based baselines.
arXiv Detail & Related papers (2024-10-08T17:40:01Z)
- RefGaussian: Disentangling Reflections from 3D Gaussian Splatting for Realistic Rendering [18.427759763663047]
We propose RefGaussian to disentangle reflections from 3D-GS for realistically modeling reflections.
We employ local regularization techniques to ensure local smoothness for both the transmitted and reflected components.
Our approach achieves superior novel view synthesis and accurate depth estimation outcomes.
arXiv Detail & Related papers (2024-06-09T16:49:39Z)
- 3D Gaussian Splatting with Deferred Reflection [25.254842246219585]
We present a deferred shading method to render specular reflection with Gaussian splatting.
Our method significantly outperforms state-of-the-art techniques and concurrent work in synthesizing high-quality specular reflection effects.
arXiv Detail & Related papers (2024-04-29T06:24:32Z)
- Relightable 3D Gaussians: Realistic Point Cloud Relighting with BRDF Decomposition and Ray Tracing [21.498078188364566]
We present a novel differentiable point-based rendering framework to achieve photo-realistic relighting.
The proposed framework showcases the potential to revolutionize the mesh-based graphics pipeline with a point-based pipeline enabling editing, tracing, and relighting.
arXiv Detail & Related papers (2023-11-27T18:07:58Z)
- GS-IR: 3D Gaussian Splatting for Inverse Rendering [71.14234327414086]
We propose GS-IR, a novel inverse rendering approach based on 3D Gaussian Splatting (GS).
We extend GS, a top-performance representation for novel view synthesis, to estimate scene geometry, surface material, and environment illumination from multi-view images captured under unknown lighting conditions.
The flexible and expressive GS representation allows us to achieve fast and compact geometry reconstruction, photorealistic novel view synthesis, and effective physically-based rendering.
arXiv Detail & Related papers (2023-11-26T02:35:09Z)
- VolRecon: Volume Rendering of Signed Ray Distance Functions for Generalizable Multi-View Reconstruction [64.09702079593372]
VolRecon is a novel generalizable implicit reconstruction method with a Signed Ray Distance Function (SRDF).
On the DTU dataset, VolRecon outperforms SparseNeuS by about 30% in sparse-view reconstruction and achieves accuracy comparable to MVSNet in full-view reconstruction.
arXiv Detail & Related papers (2022-12-15T18:59:54Z)
- SupeRVol: Super-Resolution Shape and Reflectance Estimation in Inverse Volume Rendering [42.0782248214221]
SupeRVol is an inverse rendering pipeline that allows us to recover 3D shape and material parameters from a set of color images in a super-resolution manner.
It generates reconstructions that are sharper than the individual input images, making this method ideally suited for 3D modeling from low-resolution imagery.
arXiv Detail & Related papers (2022-12-09T16:30:17Z)