Reflective Gaussian Splatting
- URL: http://arxiv.org/abs/2412.19282v2
- Date: Mon, 03 Feb 2025 13:34:08 GMT
- Title: Reflective Gaussian Splatting
- Authors: Yuxuan Yao, Zixuan Zeng, Chun Gu, Xiatian Zhu, Li Zhang
- Abstract summary: We introduce a Reflective Gaussian splatting (Ref-Gaussian) framework characterized by two components.
Ref-Gaussian surpasses existing approaches in terms of quantitative metrics, visual quality, and compute efficiency.
Also, we show that Ref-Gaussian supports more applications such as relighting and editing.
- Score: 36.111845416439095
- License:
- Abstract: Novel view synthesis has experienced significant advancements owing to increasingly capable NeRF- and 3DGS-based methods. However, reflective object reconstruction remains challenging, lacking a proper solution to achieve real-time, high-quality rendering while accommodating inter-reflection. To fill this gap, we introduce a Reflective Gaussian splatting (Ref-Gaussian) framework characterized by two components: (I) Physically based deferred rendering that empowers the rendering equation with pixel-level material properties via a split-sum approximation; (II) Gaussian-grounded inter-reflection that realizes the desired inter-reflection function within a Gaussian splatting paradigm for the first time. To enhance geometry modeling, we further introduce material-aware normal propagation and an initial per-Gaussian shading stage, along with 2D Gaussian primitives. Extensive experiments on standard datasets demonstrate that Ref-Gaussian surpasses existing approaches in terms of quantitative metrics, visual quality, and compute efficiency. Further, we show that our method serves as a unified solution for both reflective and non-reflective scenes, going beyond previous alternatives that focus only on reflective scenes. Also, we illustrate that Ref-Gaussian supports more applications such as relighting and editing.
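The split-sum approximation mentioned in the abstract factors the specular part of the rendering equation into two pre-integrated terms: a prefiltered environment lookup (radiance averaged over the roughness lobe) and a pre-integrated BRDF term, typically baked into a 2D lookup table giving a scale and bias applied to the base reflectance F0. A minimal sketch of how the two terms recombine at shading time; `toy_brdf_lut` and all constants here are illustrative stand-ins, not the paper's implementation:

```python
import numpy as np

def split_sum_specular(prefiltered_color, brdf_lut, n_dot_v, roughness, f0):
    """Combine the two pre-integrated terms of the split-sum approximation.

    prefiltered_color: environment radiance prefiltered at the given
                       roughness (first sum), sampled along the reflection
                       direction.
    brdf_lut: callable (n_dot_v, roughness) -> (scale, bias), the second
              sum, normally baked into a 2D texture.
    f0: base reflectance at normal incidence.
    """
    scale, bias = brdf_lut(n_dot_v, roughness)
    return prefiltered_color * (f0 * scale + bias)

def toy_brdf_lut(n_dot_v, roughness):
    # Toy analytic stand-in for the baked BRDF integration table:
    # scale falls off with roughness and grazing angles, bias adds a
    # small grazing-independent response.
    scale = np.clip(1.0 - roughness * 0.5, 0.0, 1.0) * n_dot_v
    bias = 0.05 * (1.0 - roughness)
    return scale, bias

if __name__ == "__main__":
    color = split_sum_specular(
        prefiltered_color=np.array([0.8, 0.7, 0.6]),  # env lookup (RGB)
        brdf_lut=toy_brdf_lut,
        n_dot_v=0.9,
        roughness=0.2,
        f0=0.04,  # typical dielectric reflectance
    )
    print(color)
```

Because both integrals are precomputed, the per-pixel cost at shading time is one texture fetch plus a multiply-add, which is what makes deferred physically based rendering of this form real-time friendly.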
Related papers
- IRGS: Inter-Reflective Gaussian Splatting with 2D Gaussian Ray Tracing [3.147103287687791]
We introduce inter-reflective Gaussian splatting (IRGS) for inverse rendering.
We apply the full rendering equation without simplification and compute incident radiance on the fly.
Furthermore, we introduce a novel strategy for querying the indirect radiance of incident light when relighting the optimized scenes.
arXiv Detail & Related papers (2024-12-20T13:10:43Z)
- EnvGS: Modeling View-Dependent Appearance with Environment Gaussian [78.74634059559891]
EnvGS is a novel approach that employs a set of Gaussian primitives as an explicit 3D representation for capturing reflections of environments.
To efficiently render these environment Gaussian primitives, we developed a ray-tracing-based rendering approach that leverages the GPU's RT cores for fast rendering.
Results from multiple real-world and synthetic datasets demonstrate that our method produces significantly more detailed reflections.
arXiv Detail & Related papers (2024-12-19T18:59:57Z)
- Ref-GS: Directional Factorization for 2D Gaussian Splatting [21.205003186833096]
Ref-GS is a novel approach for directional light factorization in 2D Gaussian splatting.
Our method achieves superior rendering for a range of open-world scenes while also accurately recovering geometry.
arXiv Detail & Related papers (2024-12-01T17:43:32Z)
- GUS-IR: Gaussian Splatting with Unified Shading for Inverse Rendering [83.69136534797686]
We present GUS-IR, a novel framework designed to address the inverse rendering problem for complicated scenes featuring rough and glossy surfaces.
This paper starts by analyzing and comparing two prominent shading techniques popularly used for inverse rendering, forward shading and deferred shading.
We propose a unified shading solution that combines the advantages of both techniques for better decomposition.
arXiv Detail & Related papers (2024-11-12T01:51:05Z)
- PF3plat: Pose-Free Feed-Forward 3D Gaussian Splatting [54.7468067660037]
PF3plat sets a new state-of-the-art across all benchmarks, supported by comprehensive ablation studies validating our design choices.
Our framework capitalizes on fast speed, scalability, and high-quality 3D reconstruction and view synthesis capabilities of 3DGS.
arXiv Detail & Related papers (2024-10-29T15:28:15Z)
- RefGaussian: Disentangling Reflections from 3D Gaussian Splatting for Realistic Rendering [18.427759763663047]
We propose RefGaussian to disentangle reflections from 3D-GS in order to model reflections realistically.
We employ local regularization techniques to ensure local smoothness for both the transmitted and reflected components.
Our approach achieves superior novel view synthesis and accurate depth estimation outcomes.
arXiv Detail & Related papers (2024-06-09T16:49:39Z)
- 3D Gaussian Splatting with Deferred Reflection [25.254842246219585]
We present a deferred shading method to render specular reflection with Gaussian splatting.
Our method significantly outperforms state-of-the-art techniques and concurrent work in synthesizing high-quality specular reflection effects.
arXiv Detail & Related papers (2024-04-29T06:24:32Z)
- Gaussian Opacity Fields: Efficient Adaptive Surface Reconstruction in Unbounded Scenes [50.92217884840301]
Gaussian Opacity Fields (GOF) is a novel approach for efficient, high-quality, and adaptive surface reconstruction in unbounded scenes.
GOF is derived from ray-tracing-based volume rendering of 3D Gaussians.
GOF surpasses existing 3DGS-based methods in surface reconstruction and novel view synthesis.
arXiv Detail & Related papers (2024-04-16T17:57:19Z)
- GS-IR: 3D Gaussian Splatting for Inverse Rendering [71.14234327414086]
We propose GS-IR, a novel inverse rendering approach based on 3D Gaussian Splatting (GS).
We extend GS, a top-performance representation for novel view synthesis, to estimate scene geometry, surface material, and environment illumination from multi-view images captured under unknown lighting conditions.
The flexible and expressive GS representation allows us to achieve fast and compact geometry reconstruction, photorealistic novel view synthesis, and effective physically-based rendering.
arXiv Detail & Related papers (2023-11-26T02:35:09Z)
- Two-Stage Single Image Reflection Removal with Reflection-Aware Guidance [78.34235841168031]
We present a novel two-stage network with reflection-aware guidance (RAGNet) for single image reflection removal (SIRR).
RAG can be used (i) to mitigate the effect of reflection from the observation, and (ii) to generate a mask in partial convolution for mitigating the effect of deviating from the linear combination hypothesis.
Experiments on five commonly used datasets demonstrate the quantitative and qualitative superiority of our RAGNet in comparison to the state-of-the-art SIRR methods.
arXiv Detail & Related papers (2020-12-02T03:14:57Z)
This list is automatically generated from the titles and abstracts of the papers on this site.