GIR: 3D Gaussian Inverse Rendering for Relightable Scene Factorization
- URL: http://arxiv.org/abs/2312.05133v1
- Date: Fri, 8 Dec 2023 16:05:15 GMT
- Title: GIR: 3D Gaussian Inverse Rendering for Relightable Scene Factorization
- Authors: Yahao Shi, Yanmin Wu, Chenming Wu, Xing Liu, Chen Zhao, Haocheng Feng,
Jingtuo Liu, Liangjun Zhang, Jian Zhang, Bin Zhou, Errui Ding, Jingdong Wang
- Abstract summary: GIR is a 3D Gaussian Inverse Rendering method for relightable scene factorization.
Our method utilizes 3D Gaussians to estimate the material properties, illumination, and geometry of an object from multi-view images.
- Score: 76.52007427483396
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper presents GIR, a 3D Gaussian Inverse Rendering method for
relightable scene factorization. Compared to existing methods leveraging
discrete meshes or neural implicit fields for inverse rendering, our method
utilizes 3D Gaussians to estimate the material properties, illumination, and
geometry of an object from multi-view images. Our study is motivated by the
evidence showing that 3D Gaussians are a more promising backbone than neural
fields in terms of performance, versatility, and efficiency. In this paper, we
aim to answer the question: "How can 3D Gaussians be applied to improve the
performance of inverse rendering?" To address the complexity of estimating
normals from discrete and often inhomogeneously distributed 3D Gaussian
representations, we introduce an efficient self-regularization method that
facilitates the modeling of surface normals without additional supervision. To
reconstruct indirect illumination, we propose an approach that
simulates ray tracing. Extensive experiments demonstrate our proposed GIR's
superior performance over existing methods across multiple tasks on a variety
of widely used datasets in inverse rendering. This substantiates its efficacy
and broad applicability, highlighting its potential as an influential tool in
relighting and reconstruction. Project page: https://3dgir.github.io
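As context for the normal-estimation step described above, here is a minimal sketch of the shortest-axis heuristic widely used with flattened 3D Gaussians in inverse rendering. The helper names (quat_to_rotmat, gaussian_normal) are illustrative, and the exact self-regularization in GIR may differ from this simplification.
```python
# A minimal sketch (not the authors' code) of the common shortest-axis
# heuristic for per-Gaussian normals: each anisotropic 3D Gaussian is
# flattened during optimization, and the axis with the smallest scale
# is taken as the surface normal, flipped to face the camera.
import numpy as np

def quat_to_rotmat(q):
    """Convert a unit quaternion (w, x, y, z) to a 3x3 rotation matrix."""
    w, x, y, z = q / np.linalg.norm(q)
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

def gaussian_normal(quat, scales, gaussian_center, camera_center):
    """Shortest-axis normal of one Gaussian, oriented toward the camera."""
    R = quat_to_rotmat(quat)
    n = R[:, np.argmin(scales)]          # axis with the smallest extent
    view_dir = camera_center - gaussian_center
    if np.dot(n, view_dir) < 0:          # flip to the camera-facing side
        n = -n
    return n

# Example: a disc-like Gaussian squashed along z.
n = gaussian_normal(np.array([1.0, 0.0, 0.0, 0.0]),   # identity rotation
                    np.array([0.5, 0.5, 0.01]),        # thin along z
                    np.zeros(3), np.array([0.0, 0.0, 5.0]))
print(n)  # -> [0. 0. 1.]
```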
Related papers
- 3D Gaussian Inverse Rendering with Approximated Global Illumination [15.899514468603627]
We present a novel approach that enables efficient global illumination for 3D Gaussians Splatting through screen-space ray tracing.
Our key insight is that a substantial amount of indirect light can be traced back to surfaces visible within the current view frustum.
In experiments, we show that the screen-space approximation we utilize allows for indirect illumination and supports real-time rendering and editing.
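For intuition, a minimal sketch of depth-buffer ray marching, the standard ingredient behind screen-space indirect lighting; the conventions here (camera-space ray with +z forward, per-frame depth and color buffers, the hypothetical ssr_trace helper) are assumptions, not taken from the paper.
```python
# A minimal sketch (assumed, not from the paper) of screen-space ray
# marching against a depth buffer: indirect light at a shading point is
# approximated by the color of the first visible surface the reflected
# ray hits inside the current view frustum.
import numpy as np

def ssr_trace(origin, direction, depth_buf, color_buf, K,
              n_steps=64, step=0.05, thickness=0.1):
    """March a camera-space ray; return the color at the first depth hit."""
    p = origin.copy()
    for _ in range(n_steps):
        p += step * direction
        if p[2] <= 0:                       # behind the camera: give up
            break
        uv = K @ p                          # pinhole projection
        u, v = int(uv[0] / uv[2]), int(uv[1] / uv[2])
        if not (0 <= v < depth_buf.shape[0] and 0 <= u < depth_buf.shape[1]):
            break                           # left the view frustum
        d = depth_buf[v, u]
        if d < p[2] < d + thickness:        # ray passed behind a surface
            return color_buf[v, u]          # treat as the indirect bounce
    return np.zeros(3)                      # miss: no screen-space hit
```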
arXiv Detail & Related papers (2025-04-02T05:02:25Z)
- GUS-IR: Gaussian Splatting with Unified Shading for Inverse Rendering [83.69136534797686]
We present GUS-IR, a novel framework designed to address the inverse rendering problem for complicated scenes featuring rough and glossy surfaces.
The paper begins by analyzing and comparing two prominent shading techniques commonly used for inverse rendering: forward shading and deferred shading.
We propose a unified shading solution that combines the advantages of both techniques for better decomposition.
arXiv Detail & Related papers (2024-11-12T01:51:05Z)
- GeoSplatting: Towards Geometry Guided Gaussian Splatting for Physically-based Inverse Rendering [69.67264955234494]
GeoSplatting is a novel approach that augments 3DGS with explicit geometry guidance for precise light transport modeling.
By differentiably constructing a surface-grounded 3DGS from an optimizable mesh, our approach leverages well-defined mesh normals and the opaque mesh surface.
This enhancement ensures precise material decomposition while preserving the efficiency and high-quality rendering capabilities of 3DGS.
arXiv Detail & Related papers (2024-10-31T17:57:07Z)
- L3DG: Latent 3D Gaussian Diffusion [74.36431175937285]
L3DG is the first approach for generative 3D modeling of 3D Gaussians through a latent 3D Gaussian diffusion formulation.
We employ a sparse convolutional architecture to efficiently operate on room-scale scenes.
By leveraging the 3D Gaussian representation, the generated scenes can be rendered from arbitrary viewpoints in real-time.
arXiv Detail & Related papers (2024-10-17T13:19:32Z)
- BiGS: Bidirectional Gaussian Primitives for Relightable 3D Gaussian Splatting [10.918133974256913]
We present Bidirectional Gaussian Primitives, an image-based novel view synthesis technique.
Our approach integrates light intrinsic decomposition into the Gaussian splatting framework, enabling real-time relighting of 3D objects.
arXiv Detail & Related papers (2024-08-23T21:04:40Z)
- Subsurface Scattering for 3D Gaussian Splatting [10.990813043493642]
3D reconstruction and relighting of objects made from scattering materials present a significant challenge due to the complex light transport beneath the surface.
We propose a framework for optimizing an object's shape together with the radiance transfer field given multi-view OLAT (one light at a time) data.
Our approach enables material editing, relighting and novel view synthesis at interactive rates.
arXiv Detail & Related papers (2024-08-22T10:34:01Z)
- PRTGaussian: Efficient Relighting Using 3D Gaussians with Precomputed Radiance Transfer [13.869132334647771]
PRTGaussian is a real-time relightable novel-view synthesis method.
By fitting relightable Gaussians to multi-view OLAT data, our method enables real-time, free-viewpoint relighting.
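For intuition, a minimal sketch of the precomputed-radiance-transfer relighting step such a method implies: once per-point transfer vectors are fit, relighting reduces to a dot product with the new light's spherical-harmonic coefficients. The array shapes and the relight helper below are assumptions for illustration, not the paper's API.
```python
# A minimal sketch (assumed, not the paper's code) of relighting with
# precomputed radiance transfer: per-point outgoing radiance is a dot
# product between a learned transfer vector and the spherical-harmonic
# coefficients of the new environment light, so relighting is instant.
import numpy as np

n_points, n_sh = 10_000, 16          # e.g. order-3 SH -> 16 coefficients

# Transfer vectors are fit offline from multi-view OLAT captures; here
# they are random stand-ins with one vector per point and color channel.
transfer = np.random.rand(n_points, 3, n_sh)

def relight(transfer, env_sh):
    """env_sh: (n_sh,) SH coefficients of the target environment light."""
    return np.einsum('pcs,s->pc', transfer, env_sh)  # (n_points, 3) RGB

colors = relight(transfer, np.random.rand(n_sh))     # new lighting, no re-fit
```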
arXiv Detail & Related papers (2024-08-10T20:57:38Z)
- GS-Phong: Meta-Learned 3D Gaussians for Relightable Novel View Synthesis [63.5925701087252]
We propose a novel method for representing a scene illuminated by a point light using a set of relightable 3D Gaussian points.
Inspired by the Blinn-Phong model, our approach decomposes the scene into ambient, diffuse, and specular components.
To facilitate the decomposition of geometric information independent of lighting conditions, we introduce a novel bilevel optimization-based meta-learning framework.
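For reference, a minimal sketch of the classic Blinn-Phong split into ambient, diffuse, and specular terms that this decomposition builds on; the blinn_phong helper and its parameter values are illustrative.
```python
# A minimal sketch of the Blinn-Phong decomposition the summary refers
# to: radiance is split into ambient, diffuse, and specular terms, with
# the specular lobe driven by the half-vector between light and view.
import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

def blinn_phong(n, l, v, light_rgb, ka, kd, ks, shininess=32.0):
    """n: surface normal, l: direction to light, v: direction to viewer."""
    h = normalize(l + v)                          # half-vector
    ambient  = ka * light_rgb
    diffuse  = kd * max(np.dot(n, l), 0.0) * light_rgb
    specular = ks * max(np.dot(n, h), 0.0) ** shininess * light_rgb
    return ambient + diffuse + specular           # the three components GS-Phong separates

color = blinn_phong(n=np.array([0.0, 0.0, 1.0]),
                    l=normalize(np.array([1.0, 1.0, 1.0])),
                    v=np.array([0.0, 0.0, 1.0]),
                    light_rgb=np.array([1.0, 1.0, 1.0]),
                    ka=0.1, kd=0.7, ks=0.2)
```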
arXiv Detail & Related papers (2024-05-31T13:48:54Z)
- Relightable 3D Gaussians: Realistic Point Cloud Relighting with BRDF Decomposition and Ray Tracing [21.498078188364566]
We present a novel differentiable point-based rendering framework to achieve photo-realistic relighting.
The proposed framework showcases the potential to revolutionize the mesh-based graphics pipeline with a point-based pipeline enabling editing, tracing, and relighting.
arXiv Detail & Related papers (2023-11-27T18:07:58Z)
- GS-SLAM: Dense Visual SLAM with 3D Gaussian Splatting [51.96353586773191]
We introduce GS-SLAM, the first method to utilize a 3D Gaussian representation in a Simultaneous Localization and Mapping (SLAM) system.
Our method utilizes a real-time differentiable splatting rendering pipeline that offers significant speedup to map optimization and RGB-D rendering.
Our method achieves competitive performance compared with existing state-of-the-art real-time methods on the Replica and TUM-RGBD datasets.
arXiv Detail & Related papers (2023-11-20T12:08:23Z)
- Extracting Triangular 3D Models, Materials, and Lighting From Images [59.33666140713829]
We present an efficient method for joint optimization of materials and lighting from multi-view image observations.
We leverage meshes with spatially-varying materials and environment lighting that can be deployed in any traditional graphics engine.
arXiv Detail & Related papers (2021-11-24T13:58:20Z)
This list is automatically generated from the titles and abstracts of the papers on this site.