Predicting Surface Reflectance Properties of Outdoor Scenes Under Unknown Natural Illumination
- URL: http://arxiv.org/abs/2105.06820v1
- Date: Fri, 14 May 2021 13:31:47 GMT
- Title: Predicting Surface Reflectance Properties of Outdoor Scenes Under Unknown Natural Illumination
- Authors: Farhan Rahman Wasee, Alen Joy, Charalambos Poullis
- Abstract summary: This paper proposes a complete framework to predict surface reflectance properties of outdoor scenes under unknown natural illumination.
We recast the problem into its two constituent components involving the BRDF's incoming light and outgoing view directions.
We present experiments showing that rendering with the predicted reflectance properties results in an appearance visually similar to rendering with textures.
- Score: 6.767885381740952
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Estimating and modelling the appearance of an object under outdoor
illumination conditions is a complex process. Although there have been several
studies on illumination estimation and relighting, very few of them focus on
estimating the reflectance properties of outdoor objects and scenes. This paper
addresses this problem and proposes a complete framework to predict surface
reflectance properties of outdoor scenes under unknown natural illumination.
Uniquely, we recast the problem into its two constituent components involving
the BRDF's incoming light and outgoing view directions: (i) surface points'
radiance captured in the images and their outgoing view directions are aggregated
and encoded into reflectance maps, and (ii) a neural network trained on
reflectance maps of renders of a unit sphere under arbitrary light directions
infers a low-parameter reflection model representing the reflectance properties
at each surface in the scene. Our model is based on a combination of
phenomenological and physics-based scattering models and can relight the scenes
from novel viewpoints. We present experiments that show that rendering with the
predicted reflectance properties results in a visually similar appearance to
using textures that cannot otherwise be disentangled from the reflectance
properties.
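The abstract only sketches how the two components interact, so the following is a minimal illustrative sketch in Python rather than the paper's implementation: per-point radiance observations and their outgoing view directions are binned into a sphere-parameterized reflectance map, and a toy low-parameter reflection model (Lambertian plus Blinn-Phong, assumed here purely for illustration) is fitted to the observations. All function names, parameters, and the least-squares fit standing in for the paper's network-based inference are assumptions; the known light direction is likewise an illustrative simplification.

```python
# Hedged sketch: reflectance-map construction plus a toy parametric fit.
# The BRDF (Lambertian + Blinn-Phong), the known light direction, and all
# names are illustrative assumptions, not the model used in the paper.
import numpy as np
from scipy.optimize import least_squares

def encode_reflectance_map(radiance, view_dirs, normal, res=64):
    """Bin observed radiance by outgoing view direction into a 2D map.

    radiance:  (N,) radiance of one surface point observed across images
    view_dirs: (N, 3) unit outgoing view directions for those observations
    normal:    (3,) surface normal at the point
    """
    # Build a local frame (t, b, n) around the normal.
    n = normal / np.linalg.norm(normal)
    t = np.cross(n, [0.0, 0.0, 1.0])
    if np.linalg.norm(t) < 1e-6:
        t = np.cross(n, [0.0, 1.0, 0.0])
    t /= np.linalg.norm(t)
    b = np.cross(n, t)

    # Spherical coordinates of the view directions in the local frame.
    local = np.stack([view_dirs @ t, view_dirs @ b, view_dirs @ n], axis=1)
    theta = np.arccos(np.clip(local[:, 2], -1.0, 1.0))   # angle from normal
    phi = np.arctan2(local[:, 1], local[:, 0]) % (2 * np.pi)

    # Accumulate mean radiance per (theta, phi) bin over the upper hemisphere;
    # below-horizon samples are clamped into the last theta bin for simplicity.
    ti = np.clip((theta / (np.pi / 2) * res).astype(int), 0, res - 1)
    pj = np.clip((phi / (2 * np.pi) * res).astype(int), 0, res - 1)
    acc = np.zeros((res, res))
    cnt = np.zeros((res, res))
    np.add.at(acc, (ti, pj), radiance)
    np.add.at(cnt, (ti, pj), 1.0)
    return np.divide(acc, cnt, out=np.full_like(acc, np.nan), where=cnt > 0)

def toy_brdf(params, light_dir, view_dirs, normal):
    """Lambertian + Blinn-Phong stand-in for a low-parameter reflection model."""
    kd, ks, shininess = params
    n_dot_l = max(float(normal @ light_dir), 0.0)
    half = light_dir[None, :] + view_dirs
    half /= np.linalg.norm(half, axis=1, keepdims=True)
    n_dot_h = np.clip(half @ normal, 0.0, 1.0)
    return kd * n_dot_l + ks * (n_dot_h ** shininess) * n_dot_l

def fit_reflection_model(radiance, view_dirs, normal, light_dir):
    """Least-squares fit used here in place of the paper's network inference."""
    def residual(p):
        return toy_brdf(p, light_dir, view_dirs, normal) - radiance
    return least_squares(residual, x0=[0.5, 0.2, 20.0],
                         bounds=([0, 0, 1], [1, 1, 500])).x
```

Once per-point parameters (kd, ks, shininess in this toy stand-in) are recovered, relighting from a novel viewpoint amounts to re-evaluating the fitted model with new view and light directions, which mirrors the relighting capability described in the abstract.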
Related papers
- Planar Reflection-Aware Neural Radiance Fields [32.709468082010126]
We introduce a reflection-aware NeRF that jointly models planar reflectors, such as windows, and explicitly casts reflected rays to capture the source of the high-frequency reflections.
Rendering along the primary ray results in a clean, reflection-free view, while explicitly rendering along the reflected ray allows us to reconstruct highly detailed reflections.
arXiv Detail & Related papers (2024-11-07T18:55:08Z)
- NeRSP: Neural 3D Reconstruction for Reflective Objects with Sparse Polarized Images [62.752710734332894]
NeRSP is a Neural 3D reconstruction technique for Reflective surfaces with Sparse Polarized images.
We derive photometric and geometric cues from the polarimetric image formation model and multiview azimuth consistency.
We achieve state-of-the-art surface reconstruction results with only 6 views as input.
arXiv Detail & Related papers (2024-06-11T09:53:18Z)
- DeepShaRM: Multi-View Shape and Reflectance Map Recovery Under Unknown Lighting [35.18426818323455]
We derive a novel multi-view method, DeepShaRM, that achieves state-of-the-art accuracy on this challenging task.
We introduce a novel deep reflectance map estimation network that recovers the camera-view reflectance maps.
A deep shape-from-shading network then updates the geometry estimate expressed with a signed distance function.
arXiv Detail & Related papers (2023-10-26T17:50:10Z)
- Factored-NeuS: Reconstructing Surfaces, Illumination, and Materials of Possibly Glossy Objects [46.04357263321969]
We develop a method that recovers the surface, materials, and illumination of a scene from its posed multi-view images.
It does not require any additional data and can handle glossy objects or bright lighting.
arXiv Detail & Related papers (2023-05-29T07:44:19Z)
- Ref-NeuS: Ambiguity-Reduced Neural Implicit Surface Learning for Multi-View Reconstruction with Reflection [24.23826907954389]
Ref-NeuS aims to reduce ambiguity by attenuating the effect of reflective surfaces.
We show that our model achieves high-quality surface reconstruction on reflective surfaces and outperforms the state of the art by a large margin.
arXiv Detail & Related papers (2023-03-20T03:08:22Z)
- Physics-based Indirect Illumination for Inverse Rendering [70.27534648770057]
We present a physics-based inverse rendering method that learns the illumination, geometry, and materials of a scene from posed multi-view RGB images.
As a side product, our physics-based inverse rendering model also facilitates flexible and realistic material editing as well as relighting.
arXiv Detail & Related papers (2022-12-09T07:33:49Z)
- NeRFactor: Neural Factorization of Shape and Reflectance Under an Unknown Illumination [60.89737319987051]
We address the problem of recovering shape and spatially-varying reflectance of an object from posed multi-view images of the object illuminated by one unknown lighting condition.
This enables the rendering of novel views of the object under arbitrary environment lighting and editing of the object's material properties.
arXiv Detail & Related papers (2021-06-03T16:18:01Z)
- Monocular Reconstruction of Neural Face Reflectance Fields [0.0]
The reflectance field of a face describes the reflectance properties responsible for complex lighting effects.
Most existing methods for estimating the face reflectance from a monocular image assume faces to be diffuse, with very few approaches adding a specular component.
We present a new neural representation for face reflectance where we can estimate all components of the reflectance responsible for the final appearance from a single monocular image.
arXiv Detail & Related papers (2020-08-24T08:19:05Z)
- Neural Reflectance Fields for Appearance Acquisition [61.542001266380375]
We present Neural Reflectance Fields, a novel deep scene representation that encodes volume density, normal and reflectance properties at any 3D point in a scene.
We combine this representation with a physically-based differentiable ray marching framework that can render images from a neural reflectance field under any viewpoint and light.
arXiv Detail & Related papers (2020-08-09T22:04:36Z)
- Polarized Reflection Removal with Perfect Alignment in the Wild [66.48211204364142]
We present a novel formulation for removing reflections from polarized images in the wild.
We first identify the misalignment issues of existing reflection removal datasets.
We build a new dataset with more than 100 types of glass in which obtained transmission images are perfectly aligned with input mixed images.
arXiv Detail & Related papers (2020-03-28T13:29:31Z)
This list is automatically generated from the titles and abstracts of the papers on this site.