Modeling Indirect Illumination for Inverse Rendering
- URL: http://arxiv.org/abs/2204.06837v1
- Date: Thu, 14 Apr 2022 09:10:55 GMT
- Title: Modeling Indirect Illumination for Inverse Rendering
- Authors: Yuanqing Zhang, Jiaming Sun, Xingyi He, Huan Fu, Rongfei Jia, Xiaowei Zhou
- Abstract summary: In this paper, we propose a novel approach to efficiently recovering spatially-varying indirect illumination.
The key insight is that indirect illumination can be conveniently derived from the neural radiance field learned from input images.
Experiments on both synthetic and real data demonstrate the superior performance of our approach compared to previous work.
- Score: 31.734819333921642
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recent advances in implicit neural representations and differentiable
rendering make it possible to simultaneously recover the geometry and materials
of an object from multi-view RGB images captured under unknown static
illumination. Despite the promising results achieved, indirect illumination is
rarely modeled in previous methods, as it requires expensive recursive path
tracing which makes the inverse rendering computationally intractable. In this
paper, we propose a novel approach to efficiently recovering spatially-varying
indirect illumination. The key insight is that indirect illumination can be
conveniently derived from the neural radiance field learned from input images
instead of being estimated jointly with direct illumination and materials. By
properly modeling the indirect illumination and visibility of direct
illumination, interreflection- and shadow-free albedo can be recovered. The
experiments on both synthetic and real data demonstrate the superior
performance of our approach compared to previous work and its capability to
synthesize realistic renderings under novel viewpoints and illumination. Our
code and data are available at https://zju3dv.github.io/invrender/.
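The abstract's key step, deriving indirect illumination by querying the learned radiance field along secondary rays instead of recursive path tracing, can be illustrated with a minimal Monte Carlo sketch. The snippet below is illustrative only: radiance_field(x, d) is a hypothetical stand-in for rendering the pre-trained NeRF along a secondary ray, and the visibility-gated diffuse shading is a simplified version of the decomposition described in the abstract, not the paper's actual implementation.

```python
import numpy as np

def sample_hemisphere(normal, n_samples, rng):
    """Uniform directions on the hemisphere around `normal` (unit vector)."""
    d = rng.normal(size=(n_samples, 3))
    d /= np.linalg.norm(d, axis=-1, keepdims=True)
    d[d @ normal < 0.0] *= -1.0  # flip samples that fall below the surface
    return d

def indirect_illumination(x, normal, radiance_field, n_samples=128, seed=0):
    """Monte Carlo estimate of incoming indirect radiance at surface point `x`,
    obtained by querying a pre-trained radiance field along secondary rays
    rather than by recursive path tracing."""
    rng = np.random.default_rng(seed)
    dirs = sample_hemisphere(normal, n_samples, rng)
    # Hypothetical interface: radiance arriving at `x` from direction `d`,
    # rendered from the learned NeRF (not the paper's actual API).
    li = np.stack([radiance_field(x, d) for d in dirs])      # (N, 3)
    cos = np.clip(dirs @ normal, 0.0, None)[:, None]         # (N, 1)
    pdf = 1.0 / (2.0 * np.pi)                                # uniform hemisphere pdf
    return (li * cos / pdf).mean(axis=0)                     # RGB estimate

def shade_diffuse(albedo, direct_li, visibility, indirect_li):
    """Simplified diffuse shading: the direct term is gated by visibility
    (shadows), while the indirect term comes from the estimate above."""
    return albedo / np.pi * (visibility * direct_li + indirect_li)
```

As a sanity check, with a constant dummy field such as `lambda x, d: np.full(3, 0.1)`, the estimate converges to about 0.1*pi per channel, the expected cosine-weighted integral of a constant radiance over the hemisphere.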
Related papers
- Baking Relightable NeRF for Real-time Direct/Indirect Illumination Rendering [4.812321790984493]
Real-time relighting is challenging due to the high computation cost of evaluating the rendering equation.
We propose a novel method that runs a CNN to compute primary surface points and rendering parameters.
Both distillations are trained from a pre-trained teacher model and provide real-time physically-based rendering under unseen lighting conditions.
arXiv Detail & Related papers (2024-09-16T14:38:26Z)
- SIRe-IR: Inverse Rendering for BRDF Reconstruction with Shadow and Illumination Removal in High-Illuminance Scenes [51.50157919750782]
We present SIRe-IR, an implicit neural inverse rendering approach that decomposes the scene into an environment map, albedo, and roughness.
By accurately modeling the indirect radiance field, normal, visibility, and direct light simultaneously, we are able to remove both shadows and indirect illumination (a generic form of this direct/indirect decomposition is sketched in the equation after this list).
Even in the presence of intense illumination, our method recovers high-quality albedo and roughness free of shadow interference.
arXiv Detail & Related papers (2023-10-19T10:44:23Z)
- Diffusion Posterior Illumination for Ambiguity-aware Inverse Rendering [63.24476194987721]
Inverse rendering, the process of inferring scene properties from images, is a challenging inverse problem.
Most existing solutions incorporate priors into the inverse-rendering pipeline to encourage plausible solutions.
We propose a novel scheme that integrates a denoising probabilistic diffusion model pre-trained on natural illumination maps into an optimization framework.
arXiv Detail & Related papers (2023-09-30T12:39:28Z)
- TensoIR: Tensorial Inverse Rendering [51.57268311847087]
TensoIR is a novel inverse rendering approach based on tensor factorization and neural fields.
It builds on TensoRF, a state-of-the-art approach for radiance field modeling.
arXiv Detail & Related papers (2023-04-24T21:39:13Z)
- NeAI: A Pre-convoluted Representation for Plug-and-Play Neural Ambient Illumination [28.433403714053103]
We propose a framework named neural ambient illumination (NeAI).
NeAI uses Neural Radiance Fields (NeRF) as a lighting model to handle complex lighting in a physically based way.
Experiments demonstrate superior novel-view rendering performance compared to previous works.
arXiv Detail & Related papers (2023-04-18T06:32:30Z)
- NeFII: Inverse Rendering for Reflectance Decomposition with Near-Field Indirect Illumination [48.42173911185454]
Inverse rendering methods aim to estimate geometry, materials and illumination from multi-view RGB images.
We propose an end-to-end inverse rendering pipeline that decomposes materials and illumination from multi-view images.
arXiv Detail & Related papers (2023-03-29T12:05:19Z)
- Physics-based Indirect Illumination for Inverse Rendering [70.27534648770057]
We present a physics-based inverse rendering method that learns the illumination, geometry, and materials of a scene from posed multi-view RGB images.
As a by-product, our physics-based inverse rendering model also facilitates flexible and realistic material editing as well as relighting.
arXiv Detail & Related papers (2022-12-09T07:33:49Z)
- DIB-R++: Learning to Predict Lighting and Material with a Hybrid Differentiable Renderer [78.91753256634453]
We consider the challenging problem of predicting intrinsic object properties from a single image by exploiting differentiable rendering.
In this work, we propose DIB-R++, a hybrid differentiable renderer which supports these effects by combining rasterization and ray-tracing.
Compared to more advanced physics-based differentiable renderers, DIB-R++ is highly performant due to its compact and expressive model.
arXiv Detail & Related papers (2021-10-30T01:59:39Z)
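Several of the works above (SIRe-IR, NeFII, the physics-based indirect illumination method, and the main paper) rest on the same decomposition of outgoing radiance into a visibility-gated direct term and a spatially-varying indirect term. A generic sketch of this split in standard rendering-equation notation follows; it is a common formulation, not the exact equation of any single paper listed here.

```latex
L_o(\mathbf{x}, \omega_o) = \int_{\Omega}
    f_r(\mathbf{x}, \omega_i, \omega_o)\,
    \big[\, V(\mathbf{x}, \omega_i)\, L_{\mathrm{dir}}(\omega_i)
          + L_{\mathrm{ind}}(\mathbf{x}, \omega_i) \,\big]\,
    (\mathbf{n} \cdot \omega_i)\, \mathrm{d}\omega_i
```

Here f_r is the BRDF, V the visibility of the direct (environment) light, L_dir the direct illumination, and L_ind the indirect incoming radiance; in the main paper L_ind is derived from the learned neural radiance field rather than estimated jointly with L_dir and the materials.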
This list is automatically generated from the titles and abstracts of the papers on this site.