Radiometrically Consistent Gaussian Surfels for Inverse Rendering
- URL: http://arxiv.org/abs/2603.01491v1
- Date: Mon, 02 Mar 2026 06:00:54 GMT
- Title: Radiometrically Consistent Gaussian Surfels for Inverse Rendering
- Authors: Kyu Beom Han, Jaeyoon Kim, Woo Jae Kim, Jinhwan Seo, Sung-eui Yoon
- Abstract summary: Inverse rendering with Gaussian Splatting has advanced rapidly, but accurately disentangling material properties remains a major challenge. We introduce radiometric consistency, a novel physically-based constraint that provides supervision for unobserved views. We then propose Radiometrically Consistent Gaussian Surfels (RadioGS), an inverse rendering framework built upon this principle.
- Score: 24.969420082811094
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Inverse rendering with Gaussian Splatting has advanced rapidly, but accurately disentangling material properties from complex global illumination effects, particularly indirect illumination, remains a major challenge. Existing methods often query indirect radiance from Gaussian primitives pre-trained for novel-view synthesis. However, these pre-trained primitives are supervised only from a limited set of training viewpoints and therefore lack supervision for modeling indirect radiance from unobserved views. To address this issue, we introduce radiometric consistency, a novel physically-based constraint that provides supervision for unobserved views by minimizing the residual between each Gaussian primitive's learned radiance and its physically-based rendered counterpart. Minimizing this residual for unobserved views establishes a self-correcting feedback loop that draws supervision from both physically-based rendering and novel-view synthesis, enabling accurate modeling of inter-reflection. We then propose Radiometrically Consistent Gaussian Surfels (RadioGS), an inverse rendering framework built upon this principle, which integrates radiometric consistency efficiently by utilizing Gaussian surfels and 2D Gaussian ray tracing. We further propose a finetuning-based relighting strategy that adapts Gaussian surfel radiances to new illuminations within minutes, achieving a low rendering cost (<10 ms). Extensive experiments on existing inverse rendering benchmarks show that RadioGS outperforms existing Gaussian-based methods in inverse rendering while retaining computational efficiency.
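As a rough, non-authoritative illustration of the constraint described in the abstract: the residual can be read as an extra loss term comparing each primitive's learned radiance to a physically-based estimate of the same quantity. The sketch below is a minimal PyTorch rendition under assumptions; the tensor names, the L1 norm, and the overall structure are illustrative, not taken from the paper.

```python
import torch

# Hypothetical per-Gaussian radiance buffers. In the actual framework these
# would come from the optimized surfels (learned) and from a physically-based
# pass combining the BRDF with ray-traced incident light (pbr).
num_gaussians = 1024
learned_radiance = torch.rand(num_gaussians, 3, requires_grad=True)  # RGB per surfel
pbr_radiance = torch.rand(num_gaussians, 3)                          # PBR estimate

def radiometric_consistency_loss(learned, pbr):
    """Residual between each primitive's learned radiance and its
    physically-based rendered counterpart. The L1 form is an assumption;
    the abstract only states that this residual is minimized."""
    return (learned - pbr).abs().mean()

# Combined with a standard photometric loss at observed training views, this
# term supplies supervision at unobserved views, which is the feedback loop
# the abstract describes.
loss_rc = radiometric_consistency_loss(learned_radiance, pbr_radiance)
loss_rc.backward()
```

In a full pipeline the PBR estimate would itself depend on the learned radiances of other primitives (inter-reflection), which is what makes the loop self-correcting rather than a one-way distillation.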
Related papers
- LiDAR-GS++: Improving LiDAR Gaussian Reconstruction via Diffusion Priors [51.724649822336346]
We present LiDAR-GS++, a reconstruction method enhanced by diffusion priors for real-time and high-fidelity re-simulation. Specifically, we introduce a controllable LiDAR generation model conditioned on coarsely extrapolated rendering to produce extra geometry-consistent scans. By extending reconstruction to under-fitted regions, our approach ensures global geometric consistency for extrapolative novel views.
arXiv Detail & Related papers (2025-11-15T17:33:12Z) - MaterialRefGS: Reflective Gaussian Splatting with Multi-view Consistent Material Inference [83.38607296779423]
We show that multi-view consistent material inference with more physically-based environment modeling is key to learning accurate reflections with Gaussian Splatting. Our method faithfully recovers both illumination and geometry, achieving state-of-the-art rendering quality in novel view synthesis.
arXiv Detail & Related papers (2025-10-13T13:29:20Z) - Differentiable Light Transport with Gaussian Surfels via Adapted Radiosity for Efficient Relighting and Geometry Reconstruction [32.713220877091565]
Radiance fields have gained tremendous success, with applications ranging from novel view synthesis to geometry reconstruction. One way to address their limitations is to incorporate physically-based rendering, but including full global illumination within the inner loop of the optimization has been prohibitively expensive.
arXiv Detail & Related papers (2025-09-23T01:02:31Z) - On the Skinning of Gaussian Avatars [6.915151148382041]
Slow rendering and backward mapping from the observation space to the canonical space have been the main challenges. We propose a weighted rotation blending approach that leverages quaternion averaging.
arXiv Detail & Related papers (2025-09-14T19:58:48Z) - Reflective Gaussian Splatting [36.111845416439095]
We introduce a Reflective Gaussian splatting (Ref-Gaussian) framework characterized by two components. Ref-Gaussian surpasses existing approaches in terms of quantitative metrics, visual quality, and compute efficiency. We also show that Ref-Gaussian supports further applications such as relighting and editing.
arXiv Detail & Related papers (2024-12-26T16:58:35Z) - IRGS: Inter-Reflective Gaussian Splatting with 2D Gaussian Ray Tracing [3.147103287687791]
We introduce inter-reflective Gaussian splatting (IRGS) for inverse rendering. We apply the full rendering equation without simplification and compute incident radiance on the fly. Furthermore, we introduce a novel strategy for querying the indirect radiance of incident light when relighting the optimized scenes. (The rendering equation referenced here is written out after this list.)
arXiv Detail & Related papers (2024-12-20T13:10:43Z) - Binocular-Guided 3D Gaussian Splatting with View Consistency for Sparse View Synthesis [53.702118455883095]
We propose a novel method for synthesizing novel views from sparse views with Gaussian Splatting.
Our key idea lies in exploring the self-supervision inherent in the binocular stereo consistency between each pair of binocular images.
Our method significantly outperforms state-of-the-art methods.
arXiv Detail & Related papers (2024-10-24T15:10:27Z) - 3D Gaussian Splatting with Deferred Reflection [25.254842246219585]
We present a deferred shading method to render specular reflection with Gaussian splatting.
Our method significantly outperforms state-of-the-art techniques and concurrent work in synthesizing high-quality specular reflection effects.
arXiv Detail & Related papers (2024-04-29T06:24:32Z) - DeferredGS: Decoupled and Editable Gaussian Splatting with Deferred Shading [50.331929164207324]
We introduce DeferredGS, a method for decoupling and editing the Gaussian splatting representation using deferred shading.
Both qualitative and quantitative experiments demonstrate the superior performance of DeferredGS in novel view and editing tasks.
arXiv Detail & Related papers (2024-04-15T01:58:54Z) - IntrinsicNeRF: Learning Intrinsic Neural Radiance Fields for Editable Novel View Synthesis [90.03590032170169]
We present intrinsic neural radiance fields, dubbed IntrinsicNeRF, which introduce intrinsic decomposition into the NeRF-based neural rendering method.
Our experiments and editing samples on both object-specific/room-scale scenes and synthetic/real-world data demonstrate that we can obtain consistent intrinsic decomposition results.
arXiv Detail & Related papers (2022-10-02T22:45:11Z) - InfoNeRF: Ray Entropy Minimization for Few-Shot Neural Volume Rendering [55.70938412352287]
We present an information-theoretic regularization technique for few-shot novel view synthesis based on neural implicit representation.
The proposed approach minimizes the potential reconstruction inconsistency that arises from insufficient viewpoints.
We achieve consistently improved performance over existing neural view synthesis methods by large margins on multiple standard benchmarks.
arXiv Detail & Related papers (2021-12-31T11:56:01Z)
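For background on the "full rendering equation" mentioned in the IRGS entry above, the standard formulation it refers to is the following. This is textbook material (Kajiya's rendering equation), not IRGS-specific notation:

```latex
% Rendering equation: outgoing radiance at surface point x in direction w_o
% equals emitted radiance plus BRDF-weighted incident radiance integrated
% over the upper hemisphere. Per its summary, IRGS evaluates the incident
% radiance L_i on the fly (via 2D Gaussian ray tracing) rather than
% simplifying this integral.
\[
  L_o(\mathbf{x}, \omega_o) = L_e(\mathbf{x}, \omega_o)
  + \int_{\Omega} f_r(\mathbf{x}, \omega_i, \omega_o)\,
    L_i(\mathbf{x}, \omega_i)\,(\omega_i \cdot \mathbf{n})\,\mathrm{d}\omega_i
\]
```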