3D Gaussian Inverse Rendering with Approximated Global Illumination
- URL: http://arxiv.org/abs/2504.01358v1
- Date: Wed, 02 Apr 2025 05:02:25 GMT
- Title: 3D Gaussian Inverse Rendering with Approximated Global Illumination
- Authors: Zirui Wu, Jianteng Chen, Laijian Li, Shaoteng Wu, Zhikai Zhu, Kang Xu, Martin R. Oswald, Jie Song
- Abstract summary: We present a novel approach that enables efficient global illumination for 3D Gaussian Splatting through screen-space ray tracing. Our key insight is that a substantial amount of indirect light can be traced back to surfaces visible within the current view frustum. In experiments, we show that the screen-space approximation we utilize enables indirect illumination and supports real-time rendering and editing.
- Score: 15.899514468603627
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: 3D Gaussian Splatting shows great potential for reconstructing photo-realistic 3D scenes. However, these methods typically bake illumination into their representations, limiting their use for physically-based rendering and scene editing. Although recent inverse rendering approaches aim to decompose scenes into material and lighting components, they often rely on simplifying assumptions that fail under editing. We present a novel approach that enables efficient global illumination for 3D Gaussian Splatting through screen-space ray tracing. Our key insight is that a substantial amount of indirect light can be traced back to surfaces visible within the current view frustum. Leveraging this observation, we augment the direct shading computed by 3D Gaussians with Monte-Carlo screen-space ray tracing to capture one-bounce indirect illumination. In this way, our method enables realistic global illumination without sacrificing the computational efficiency and editability benefits of 3D Gaussians. Through experiments, we show that our screen-space approximation enables realistic indirect illumination while supporting real-time rendering and editing. Code, data, and models will be made available at our project page: https://wuzirui.github.io/gs-ssr.
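The core idea lends itself to a compact sketch. Below is a minimal, illustrative version of Monte-Carlo screen-space ray tracing for one-bounce indirect light, assuming per-pixel depth, normal, and direct-radiance buffers plus a pinhole intrinsic matrix K; the buffer names, step sizes, and sample counts are assumptions for illustration, not the authors' implementation.

```python
# Hedged sketch: Monte-Carlo screen-space ray tracing for one-bounce
# indirect light. All buffer names, the camera model, and the constants
# are illustrative assumptions, not the paper's actual pipeline.
import numpy as np

def sample_hemisphere(normal, rng):
    """Cosine-weighted direction around `normal` (view space)."""
    u1, u2 = rng.random(), rng.random()
    r, phi = np.sqrt(u1), 2.0 * np.pi * u2
    local = np.array([r * np.cos(phi), r * np.sin(phi), np.sqrt(1.0 - u1)])
    # Build an orthonormal basis around the normal.
    t = np.cross(normal, [0.0, 1.0, 0.0] if abs(normal[1]) < 0.9 else [1.0, 0.0, 0.0])
    t /= np.linalg.norm(t)
    b = np.cross(normal, t)
    return local[0] * t + local[1] * b + local[2] * normal

def trace_one_bounce(px, py, depth, normal, direct, K, rng,
                     n_samples=8, n_steps=32, thickness=0.05):
    """Estimate one-bounce indirect radiance at pixel (px, py) by marching
    rays in view space and testing them against the depth buffer."""
    h, w = depth.shape
    z = depth[py, px]
    # Reconstruct the view-space position of this pixel.
    origin = z * (np.linalg.inv(K) @ np.array([px + 0.5, py + 0.5, 1.0]))
    n = normal[py, px]
    indirect = np.zeros(3)
    for _ in range(n_samples):
        d = sample_hemisphere(n, rng)
        p = origin.copy()
        for _ in range(n_steps):
            p += 0.05 * d                    # fixed view-space step (assumed)
            q = K @ p                        # project back onto the screen
            if q[2] <= 1e-6:
                break                        # marched behind the camera
            sx, sy = int(q[0] / q[2]), int(q[1] / q[2])
            if not (0 <= sx < w and 0 <= sy < h):
                break                        # ray left the view frustum
            if 0.0 < p[2] - depth[sy, sx] < thickness:
                indirect += direct[sy, sx]   # hit: reuse the direct shading
                break
    return indirect / n_samples
```

In the pipeline the abstract describes, an estimate like this would be added on top of the direct shading produced by the 3D Gaussian rasterizer; rays that exit the view frustum contribute nothing, which is the inherent cost of a screen-space approximation.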
Related papers
- ODGS: 3D Scene Reconstruction from Omnidirectional Images with 3D Gaussian Splatting [48.72040500647568]
We present ODGS, a novel rasterization pipeline for omnidirectional images with geometric interpretation.
The entire pipeline is parallelized using CUDA, achieving optimization and rendering speeds 100 times faster than NeRF-based methods.
Results show ODGS restores fine details effectively, even when reconstructing large 3D scenes.
arXiv Detail & Related papers (2024-10-28T02:45:13Z) - GI-GS: Global Illumination Decomposition on Gaussian Splatting for Inverse Rendering [6.820642721852439]
We present GI-GS, a novel inverse rendering framework that leverages 3D Gaussian Splatting (3DGS) and deferred shading.
In our framework, we first render a G-buffer to capture the detailed geometry and material properties of the scene.
With the G-buffer and previous rendering results, the indirect lighting can be calculated through lightweight path tracing.
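As a rough illustration of the deferred-shading step (the buffer names and the Lambertian-only lighting model are assumptions for this sketch, not GI-GS's actual material model):

```python
# Hedged sketch of deferred shading from a G-buffer: geometry and material
# are rasterized into per-pixel buffers first, then lighting is computed
# per pixel. Lambertian-only, directional light; both are assumptions.
import numpy as np

def deferred_shade(gbuffer, light_dir, light_rgb):
    """Direct Lambertian shading from a G-buffer of albedo and normals.
    `light_dir` points from the surface toward the light (unit vector)."""
    albedo = gbuffer["albedo"]                       # (H, W, 3)
    normal = gbuffer["normal"]                       # (H, W, 3), unit length
    ndotl = np.clip(normal @ light_dir, 0.0, None)   # (H, W)
    return albedo * ndotl[..., None] * light_rgb     # (H, W, 3)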
arXiv Detail & Related papers (2024-10-03T15:58:18Z) - EVER: Exact Volumetric Ellipsoid Rendering for Real-time View Synthesis [72.53316783628803]
We present Exact Volumetric Ellipsoid Rendering (EVER), a method for real-time differentiable emission-only volume rendering.
Unlike the recent rasterization-based approach of 3D Gaussian Splatting (3DGS), our primitive-based representation allows for exact volume rendering.
We show that our method is more accurate, with fewer blending issues, than 3DGS and follow-up work on view-consistent rendering.
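For intuition, the geometric core of such a primitive-based renderer is an exact ray-ellipsoid intersection. The sketch below uses an illustrative convention (an ellipsoid given as a linear map A of the unit sphere centered at c); it shows the standard construction, not the paper's specific algorithm.

```python
# Hedged sketch: exact ray-ellipsoid intersection by transforming the ray
# into the space where the ellipsoid becomes the unit sphere.
import numpy as np

def ray_ellipsoid(o, d, c, A):
    """Return entry/exit distances (t0, t1) along the ray o + t*d for the
    ellipsoid {c + A*u : |u| = 1}, or None if the ray misses it."""
    Ainv = np.linalg.inv(A)
    o2, d2 = Ainv @ (o - c), Ainv @ d     # ray in unit-sphere space
    a = d2 @ d2
    b = 2.0 * (o2 @ d2)
    k = o2 @ o2 - 1.0
    disc = b * b - 4.0 * a * k
    if disc < 0.0:
        return None                       # ray misses the ellipsoid
    s = np.sqrt(disc)
    return (-b - s) / (2.0 * a), (-b + s) / (2.0 * a)
```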
arXiv Detail & Related papers (2024-10-02T17:59:09Z) - 3D Gaussian Ray Tracing: Fast Tracing of Particle Scenes [50.36933474990516]
This work considers ray tracing the particles, building a bounding volume hierarchy and casting a ray for each pixel using high-performance ray tracing hardware.
To efficiently handle large numbers of semi-transparent particles, we describe a specialized algorithm which encapsulates particles with bounding meshes.
Experiments demonstrate the speed and accuracy of our approach, as well as several applications in computer graphics and vision.
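The per-ray accumulation such a tracer performs can be sketched independently of the BVH machinery. The snippet below composites a list of semi-transparent hits front to back; this is the standard formulation rather than the paper's exact kernel, and the hit gathering itself (BVH build and traversal) is omitted.

```python
# Hedged sketch: front-to-back compositing of semi-transparent particle
# hits along one ray, after an acceleration structure returns the hits.
import numpy as np

def composite_hits(hits):
    """hits: list of (t, rgb, alpha) tuples; returns accumulated radiance."""
    radiance, transmittance = np.zeros(3), 1.0
    for _, rgb, alpha in sorted(hits, key=lambda h: h[0]):  # near to far
        radiance += transmittance * alpha * np.asarray(rgb)
        transmittance *= 1.0 - alpha
        if transmittance < 1e-3:          # early termination: ray saturated
            break
    return radiance
```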
arXiv Detail & Related papers (2024-07-09T17:59:30Z) - GIR: 3D Gaussian Inverse Rendering for Relightable Scene Factorization [62.13932669494098]
This paper presents a 3D Gaussian Inverse Rendering (GIR) method, employing 3D Gaussian representations to factorize the scene into material properties, light, and geometry.
We compute the normal of each 3D Gaussian using the shortest eigenvector, with a directional masking scheme that enforces accurate normal estimation without external supervision.
We adopt an efficient voxel-based indirect illumination tracing scheme that stores direction-aware outgoing radiance in each 3D Gaussian to disentangle secondary illumination for approximating multi-bounce light transport.
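The normal rule described here is simple enough to state in a few lines. The sketch below takes the eigenvector of the covariance with the smallest eigenvalue and flips it toward the viewer; the paper's directional masking scheme is not reproduced.

```python
# Hedged sketch: a Gaussian's normal as the shortest axis of its
# covariance ellipsoid, oriented toward the camera.
import numpy as np

def gaussian_normal(cov, view_dir):
    """cov: 3x3 covariance of one Gaussian; view_dir: unit vector toward
    the camera. Returns the viewer-facing shortest eigenvector."""
    eigvals, eigvecs = np.linalg.eigh(cov)    # eigenvalues in ascending order
    n = eigvecs[:, 0]                         # shortest axis of the ellipsoid
    return n if n @ view_dir > 0.0 else -n    # flip to face the viewer
```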
arXiv Detail & Related papers (2023-12-08T16:05:15Z) - SplaTAM: Splat, Track & Map 3D Gaussians for Dense RGB-D SLAM [48.190398577764284]
SplaTAM is an approach that enables high-fidelity reconstruction from a single unposed RGB-D camera.
It employs a simple online tracking and mapping system tailored to the underlying Gaussian representation.
Experiments show that SplaTAM achieves up to 2x superior performance in camera pose estimation, map construction, and novel-view synthesis over existing methods.
arXiv Detail & Related papers (2023-12-04T18:53:24Z) - Scaffold-GS: Structured 3D Gaussians for View-Adaptive Rendering [71.44349029439944]
The recent 3D Gaussian Splatting method has achieved state-of-the-art rendering quality and speed.
We introduce Scaffold-GS, which uses anchor points to distribute local 3D Gaussians.
We show that our method effectively reduces redundant Gaussians while delivering high-quality rendering.
arXiv Detail & Related papers (2023-11-30T17:58:57Z) - Relightable 3D Gaussians: Realistic Point Cloud Relighting with BRDF Decomposition and Ray Tracing [21.498078188364566]
We present a novel differentiable point-based rendering framework to achieve photo-realistic relighting.
The proposed framework showcases the potential to revolutionize the mesh-based graphics pipeline with a point-based pipeline enabling editing, tracing, and relighting.
arXiv Detail & Related papers (2023-11-27T18:07:58Z) - Learning Indoor Inverse Rendering with 3D Spatially-Varying Lighting [149.1673041605155]
We address the problem of jointly estimating albedo, normals, depth and 3D spatially-varying lighting from a single image.
Most existing methods formulate the task as image-to-image translation, ignoring the 3D properties of the scene.
We propose a unified, learning-based inverse rendering framework that formulates 3D spatially-varying lighting.
arXiv Detail & Related papers (2021-09-13T15:29:03Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.