SIRe-IR: Inverse Rendering for BRDF Reconstruction with Shadow and
Illumination Removal in High-Illuminance Scenes
- URL: http://arxiv.org/abs/2310.13030v2
- Date: Sun, 19 Nov 2023 15:47:18 GMT
- Title: SIRe-IR: Inverse Rendering for BRDF Reconstruction with Shadow and
Illumination Removal in High-Illuminance Scenes
- Authors: Ziyi Yang, Yanzhen Chen, Xinyu Gao, Yazhen Yuan, Yu Wu, Xiaowei Zhou,
Xiaogang Jin
- Abstract summary: We present SIRe-IR, an implicit neural inverse rendering approach that decomposes the scene into an environment map, albedo, and roughness.
By accurately modeling the indirect radiance field, normal, visibility, and direct light simultaneously, we are able to remove both shadows and indirect illumination.
Even in the presence of intense illumination, our method recovers high-quality albedo and roughness with no shadow interference.
- Score: 51.50157919750782
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Implicit neural representation has opened up new possibilities for inverse
rendering. However, existing implicit neural inverse rendering methods struggle
to handle strongly illuminated scenes with significant shadows and indirect
illumination. The existence of shadows and reflections can lead to an
inaccurate understanding of scene geometry, making precise factorization
difficult. To this end, we present SIRe-IR, an implicit neural inverse
rendering approach that uses non-linear mapping and regularized visibility
estimation to decompose the scene into environment map, albedo, and roughness.
By accurately modeling the indirect radiance field, normal, visibility, and
direct light simultaneously, we are able to remove both shadows and indirect
illumination in materials without imposing strict constraints on the scene.
Even in the presence of intense illumination, our method recovers high-quality
albedo and roughness with no shadow interference. SIRe-IR outperforms existing
methods in both quantitative and qualitative evaluations.
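The decomposition described in the abstract splits incoming light at each surface point into direct radiance, gated by a per-direction visibility term that produces cast shadows, and an indirect radiance term. A minimal Lambertian sketch of that split is below; the function and array names are illustrative, not the paper's API.

```python
import numpy as np

def shade_point(albedo, normal, light_dirs, light_radiance, visibility, indirect):
    """Monte Carlo estimate of outgoing radiance at one surface point.

    Incoming light is split into a direct term (gated by per-direction
    visibility, which produces cast shadows) and an indirect term, as in
    the decomposition the abstract describes. Lambertian BRDF only.
    """
    # Cosine foreshortening for each sampled direction, clamped below the horizon.
    cos_theta = np.clip(light_dirs @ normal, 0.0, None)            # (N,)
    # Direct light is masked by visibility; indirect radiance is added unmasked.
    incoming = visibility * light_radiance + indirect              # (N, 3)
    brdf = albedo / np.pi                                          # Lambertian BRDF
    # Uniform-hemisphere Monte Carlo estimator over N sampled directions.
    return (2.0 * np.pi) * np.mean(brdf * incoming * cos_theta[:, None], axis=0)
```

With visibility set to zero everywhere, the direct term vanishes and only indirect radiance remains; keeping the two terms separate is what lets a method of this kind recover an albedo free of baked-in shadows.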
Related papers
- Photometric Inverse Rendering: Shading Cues Modeling and Surface Reflectance Regularization [46.146783750386994]
We propose a new method for neural inverse rendering.
Our method jointly optimizes the light source position to account for self-shadows in images.
To enhance surface reflectance decomposition, we introduce a new regularization.
arXiv Detail & Related papers (2024-08-13T11:39:14Z)
- GaNI: Global and Near Field Illumination Aware Neural Inverse Rendering [21.584362527926654]
GaNI can reconstruct geometry, albedo, and roughness parameters from images of a scene captured with co-located light and camera.
Existing inverse rendering techniques with co-located light-camera focus on single objects only.
arXiv Detail & Related papers (2024-03-22T23:47:19Z)
- SIR: Multi-view Inverse Rendering with Decomposable Shadow for Indoor Scenes [0.88756501225368]
We propose SIR, an efficient method to decompose differentiable shadows for inverse rendering on indoor scenes using multi-view data.
SIR explicitly learns shadows for enhanced realism in material estimation under unknown light positions.
The significant decomposing ability of SIR enables sophisticated editing capabilities like free-view relighting, object insertion, and material replacement.
arXiv Detail & Related papers (2024-02-09T01:48:44Z)
- TensoIR: Tensorial Inverse Rendering [51.57268311847087]
TensoIR is a novel inverse rendering approach based on tensor factorization and neural fields.
It builds on TensoRF, a state-of-the-art approach to radiance field modeling.
arXiv Detail & Related papers (2023-04-24T21:39:13Z)
- Neural Fields meet Explicit Geometric Representation for Inverse Rendering of Urban Scenes [62.769186261245416]
We present a novel inverse rendering framework for large urban scenes capable of jointly reconstructing the scene geometry, spatially-varying materials, and HDR lighting from a set of posed RGB images with optional depth.
Specifically, we use a neural field to account for the primary rays, and use an explicit mesh (reconstructed from the underlying neural field) for modeling secondary rays that produce higher-order lighting effects such as cast shadows.
arXiv Detail & Related papers (2023-04-06T17:51:54Z)
- NeFII: Inverse Rendering for Reflectance Decomposition with Near-Field Indirect Illumination [48.42173911185454]
Inverse rendering methods aim to estimate geometry, materials and illumination from multi-view RGB images.
We propose an end-to-end inverse rendering pipeline that decomposes materials and illumination from multi-view images.
arXiv Detail & Related papers (2023-03-29T12:05:19Z)
- Modeling Indirect Illumination for Inverse Rendering [31.734819333921642]
In this paper, we propose a novel approach to efficiently recovering spatially-varying indirect illumination.
The key insight is that indirect illumination can be conveniently derived from the neural radiance field learned from input images.
Experiments on both synthetic and real data demonstrate the superior performance of our approach compared to previous work.
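The key insight above, that indirect illumination can be read off a radiance field learned from the input images, can be sketched as follows. `radiance_field` and `trace_to_surface` are hypothetical callables standing in for a trained NeRF-style model and a ray marcher; they are assumptions for illustration, not any paper's actual API.

```python
import numpy as np

def indirect_illumination(point, sample_dirs, radiance_field, trace_to_surface):
    """Estimate per-direction indirect radiance arriving at `point`.

    For each sampled direction, trace a secondary ray; if it hits other
    geometry, the radiance leaving that hit point back toward `point` is
    exactly what the learned radiance field was trained to predict, so we
    query it instead of recursively path tracing.
    """
    indirect = np.zeros((len(sample_dirs), 3))
    for i, d in enumerate(sample_dirs):
        hit, hit_point = trace_to_surface(point, d)   # secondary-ray intersection
        if hit:
            # Query the field for radiance at hit_point toward `point` (direction -d).
            indirect[i] = radiance_field(hit_point, -d)
    return indirect
```

Directions whose secondary rays escape to the environment contribute zero here; in a full system they would instead sample the environment map as direct light.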
arXiv Detail & Related papers (2022-04-14T09:10:55Z)
- NeRFactor: Neural Factorization of Shape and Reflectance Under an Unknown Illumination [60.89737319987051]
We address the problem of recovering shape and spatially-varying reflectance of an object from posed multi-view images of the object illuminated by one unknown lighting condition.
This enables the rendering of novel views of the object under arbitrary environment lighting and editing of the object's material properties.
arXiv Detail & Related papers (2021-06-03T16:18:01Z)
- Towards High Fidelity Monocular Face Reconstruction with Rich Reflectance using Self-supervised Learning and Ray Tracing [49.759478460828504]
Methods combining deep neural network encoders with differentiable rendering have opened up the path for very fast monocular reconstruction of geometry, lighting and reflectance.
Previously, ray tracing was introduced for monocular face reconstruction only within a classic optimization-based framework.
We propose a new method that greatly improves reconstruction quality and robustness in general scenes.
arXiv Detail & Related papers (2021-03-29T08:58:10Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.