Factored-NeuS: Reconstructing Surfaces, Illumination, and Materials of Possibly Glossy Objects
- URL: http://arxiv.org/abs/2305.17929v1
- Date: Mon, 29 May 2023 07:44:19 GMT
- Title: Factored-NeuS: Reconstructing Surfaces, Illumination, and Materials of Possibly Glossy Objects
- Authors: Yue Fan, Ivan Skorokhodov, Oleg Voynov, Savva Ignatyev, Evgeny Burnaev, Peter Wonka, Yiqun Wang
- Abstract summary: We develop a method that recovers the surface, materials, and illumination of a scene from its posed multi-view images.
It does not require any additional data and can handle glossy objects or bright lighting.
- Score: 46.04357263321969
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We develop a method that recovers the surface, materials, and illumination of
a scene from its posed multi-view images. In contrast to prior work, it does
not require any additional data and can handle glossy objects or bright
lighting. It is a progressive inverse rendering approach, which consists of
three stages. First, we reconstruct the scene radiance and signed distance
function (SDF) with our novel regularization strategy for specular reflections.
Our approach models both diffuse and specular colors, which allows surface
reconstruction to handle complex view-dependent lighting effects.
Second, we distill light visibility and indirect illumination from the learned
SDF and radiance field using learnable mapping functions. Third, we design a
method for estimating the ratio of incoming direct light, represented via
Spherical Gaussians, that is reflected in a specular manner, and then
reconstruct the materials and direct illumination of the scene. Experimental results
demonstrate that the proposed method outperforms the current state-of-the-art
in recovering surfaces, materials, and lighting without relying on any
additional data.
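To make the factorization concrete, here is a minimal PyTorch sketch of the two ingredients the abstract names: a radiance field split into diffuse and specular branches (first stage) and a Spherical Gaussian lobe for representing direct light (third stage). Everything here, including layer sizes, activations, and the choice of inputs, is an illustrative assumption rather than the authors' published architecture.

```python
import torch
import torch.nn as nn

def eval_spherical_gaussian(w, lobe_axis, sharpness, amplitude):
    """One Spherical Gaussian lobe: G(w) = amplitude * exp(sharpness * (dot(w, axis) - 1)).
    The paper represents incoming direct light as a mixture of such lobes."""
    cos_theta = (w * lobe_axis).sum(dim=-1, keepdim=True)
    return amplitude * torch.exp(sharpness * (cos_theta - 1.0))

class FactoredRadiance(nn.Module):
    """Two-branch radiance head: a view-independent diffuse color plus a
    view-dependent specular color whose sum is the rendered radiance.
    Layer sizes and inputs are illustrative assumptions."""
    def __init__(self, feat_dim=256):
        super().__init__()
        self.diffuse_head = nn.Sequential(
            nn.Linear(feat_dim, 128), nn.ReLU(),
            nn.Linear(128, 3), nn.Sigmoid(),        # view-independent RGB
        )
        # The specular branch additionally sees the surface normal and view direction.
        self.specular_head = nn.Sequential(
            nn.Linear(feat_dim + 3 + 3, 128), nn.ReLU(),
            nn.Linear(128, 3), nn.Softplus(),       # non-negative specular residual
        )

    def forward(self, feat, normal, view_dir):
        diffuse = self.diffuse_head(feat)
        specular = self.specular_head(torch.cat([feat, normal, view_dir], dim=-1))
        return diffuse + specular, diffuse, specular
```

In a NeuS-style pipeline, the summed color would be composited along each ray with SDF-derived volume-rendering weights; keeping the diffuse and specular outputs separate is what lets a regularization strategy target view-dependent reflections specifically, as the first stage requires.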
Related papers
- RISE-SDF: a Relightable Information-Shared Signed Distance Field for Glossy Object Inverse Rendering [26.988572852463815]
In this paper, we propose a novel end-to-end relightable neural inverse rendering system.
Our experiments demonstrate that our algorithm achieves state-of-the-art performance in inverse rendering and relighting.
arXiv Detail & Related papers (2024-09-30T09:42:10Z)
- Neural Relighting with Subsurface Scattering by Learning the Radiance Transfer Gradient [73.52585139592398]
We propose a novel framework for learning the radiance transfer field via volume rendering.
We will publicly release our code and a novel light stage dataset of objects with subsurface scattering effects.
arXiv Detail & Related papers (2023-06-15T17:56:04Z)
- NeILF++: Inter-Reflectable Light Fields for Geometry and Material Estimation [36.09503501647977]
We formulate the lighting of a static scene as one neural incident light field (NeILF) and one outgoing neural radiance field (NeRF).
The proposed method is able to achieve state-of-the-art results in terms of geometry reconstruction quality, material estimation accuracy, and the fidelity of novel view rendering.
arXiv Detail & Related papers (2023-03-30T04:59:48Z)
- NeFII: Inverse Rendering for Reflectance Decomposition with Near-Field Indirect Illumination [48.42173911185454]
Inverse rendering methods aim to estimate geometry, materials and illumination from multi-view RGB images.
We propose an end-to-end inverse rendering pipeline that decomposes materials and illumination from multi-view images.
arXiv Detail & Related papers (2023-03-29T12:05:19Z)
- Physics-based Indirect Illumination for Inverse Rendering [70.27534648770057]
We present a physics-based inverse rendering method that learns the illumination, geometry, and materials of a scene from posed multi-view RGB images.
As a side product, our physics-based inverse rendering model also facilitates flexible and realistic material editing as well as relighting.
arXiv Detail & Related papers (2022-12-09T07:33:49Z)
- Physically-Based Editing of Indoor Scene Lighting from a Single Image [106.60252793395104]
We present a method to edit complex indoor lighting from a single image with its predicted depth and light source segmentation masks.
We tackle this problem using two novel components: 1) a holistic scene reconstruction method that estimates scene reflectance and parametric 3D lighting, and 2) a neural rendering framework that re-renders the scene from our predictions.
arXiv Detail & Related papers (2022-05-19T06:44:37Z)
- NeILF: Neural Incident Light Field for Physically-based Material Estimation [31.230609753253713]
We present a differentiable rendering framework for material and lighting estimation from multi-view images and a reconstructed geometry.
In the framework, we represent scene lighting as the Neural Incident Light Field (NeILF) and material properties as the surface BRDF modelled by multi-layer perceptrons.
arXiv Detail & Related papers (2022-03-14T15:23:04Z)
- NeRFactor: Neural Factorization of Shape and Reflectance Under an Unknown Illumination [60.89737319987051]
We address the problem of recovering shape and spatially-varying reflectance of an object from posed multi-view images of the object illuminated by one unknown lighting condition.
This enables the rendering of novel views of the object under arbitrary environment lighting and editing of the object's material properties.
arXiv Detail & Related papers (2021-06-03T16:18:01Z)
- NeRV: Neural Reflectance and Visibility Fields for Relighting and View Synthesis [45.71507069571216]
We present a method that takes as input a set of images of a scene illuminated by unconstrained known lighting.
This produces a 3D representation that can be rendered from novel viewpoints under arbitrary lighting conditions.
arXiv Detail & Related papers (2020-12-07T18:56:08Z)
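Several of the related methods above (NeILF, NeILF++) represent scene lighting as a neural incident light field: an MLP queried with a surface point and an incoming direction. A minimal sketch of that idea follows; the network shape, activations, and the absence of any input encoding are assumptions rather than the papers' exact architectures.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class IncidentLightField(nn.Module):
    """NeILF-style incident light field sketch: maps a surface point x and an
    incoming direction w_i to incident radiance. All sizes are assumed."""
    def __init__(self, hidden=128):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(3 + 3, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 3), nn.Softplus(),  # radiance is non-negative
        )

    def forward(self, x, w_i):
        return self.mlp(torch.cat([x, w_i], dim=-1))

if __name__ == "__main__":
    field = IncidentLightField()
    x = torch.randn(4, 3)                          # surface points
    w_i = F.normalize(torch.randn(4, 3), dim=-1)   # unit incoming directions
    print(field(x, w_i).shape)                     # torch.Size([4, 3])
```

A physically based renderer integrates such a field against a BRDF over the hemisphere at each surface point; NeILF++ additionally couples it with an outgoing radiance field to account for inter-reflections.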