Learning Neural Transmittance for Efficient Rendering of Reflectance
Fields
- URL: http://arxiv.org/abs/2110.13272v1
- Date: Mon, 25 Oct 2021 21:12:25 GMT
- Title: Learning Neural Transmittance for Efficient Rendering of Reflectance
Fields
- Authors: Mohammad Shafiei, Sai Bi, Zhengqin Li, Aidas Liaudanskas, Rodrigo
Ortiz-Cayon, Ravi Ramamoorthi
- Abstract summary: We propose a novel method based on precomputed Neural Transmittance Functions to accelerate rendering of neural reflectance fields.
Results on real and synthetic scenes demonstrate almost two orders of magnitude speedup for rendering under environment maps with minimal accuracy loss.
- Score: 43.24427791156121
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recently neural volumetric representations such as neural reflectance fields
have been widely applied to faithfully reproduce the appearance of real-world
objects and scenes under novel viewpoints and lighting conditions. However, it
remains challenging and time-consuming to render such representations under
complex lighting such as environment maps, which requires individual ray
marching towards each single light to calculate the transmittance at every
sampled point. In this paper, we propose a novel method based on precomputed
Neural Transmittance Functions to accelerate the rendering of neural
reflectance fields. Our neural transmittance functions enable us to efficiently
query the transmittance at an arbitrary point in space along an arbitrary ray
without tedious ray marching, which effectively reduces the time-complexity of
the rendering. We propose a novel formulation for the neural transmittance
function, and train it jointly with the neural reflectance fields on images
captured under collocated camera and light, while enforcing monotonicity.
Results on real and synthetic scenes demonstrate almost two orders of magnitude
speedup for rendering under environment maps with minimal accuracy loss.
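The cost the paper attacks is the inner transmittance integral: under an environment map, every shading point must march a shadow ray toward every light direction to evaluate T = exp(-∫ σ ds). The sketch below (not the paper's code; the toy density `sigma` is a hypothetical stand-in for a trained neural density field) numerically integrates this baseline, which the precomputed Neural Transmittance Function replaces with a single query per ray.

```python
import numpy as np

def transmittance_ray_march(sigma, x, light_dir, t_max=4.0, n_steps=128):
    """Estimate T = exp(-integral of sigma ds) from point x toward a light
    by marching along the shadow ray -- the per-light cost the paper removes."""
    ts = np.linspace(0.0, t_max, n_steps)
    dt = ts[1] - ts[0]
    pts = x[None, :] + ts[:, None] * light_dir[None, :]   # (n_steps, 3) samples
    densities = sigma(pts)                                # (n_steps,) volume density
    return np.exp(-np.sum(densities * dt))                # discretized integral

# Toy density: a uniform sphere occluder of radius 0.5 at the origin.
def sigma(pts):
    r = np.linalg.norm(pts, axis=-1)
    return np.where(r < 0.5, 5.0, 0.0)

x = np.array([-1.0, 0.0, 0.0])
toward_occluder = np.array([1.0, 0.0, 0.0])   # shadow ray hits the sphere
away = np.array([-1.0, 0.0, 0.0])             # shadow ray misses it

T_blocked = transmittance_ray_march(sigma, x, toward_occluder)  # near exp(-5)
T_clear = transmittance_ray_march(sigma, x, away)               # 1.0 (unoccluded)
```

With an environment map discretized into L lights and N samples per shadow ray, this baseline costs O(L x N) density evaluations per shading point; a learned transmittance function reduces that to O(L) queries.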
Related papers
- NeRF-Casting: Improved View-Dependent Appearance with Consistent Reflections [57.63028964831785]
Recent works have improved NeRF's ability to render detailed specular appearance of distant environment illumination, but are unable to synthesize consistent reflections of closer content.
We address these issues with an approach based on ray tracing.
Instead of querying an expensive neural network for the outgoing view-dependent radiance at points along each camera ray, our model casts rays from these points and traces them through the NeRF representation to render feature vectors.
arXiv Detail & Related papers (2024-05-23T17:59:57Z)
- Flying with Photons: Rendering Novel Views of Propagating Light [37.06220870989172]
We present an imaging and neural rendering technique that seeks to synthesize videos of light propagating through a scene from novel, moving camera viewpoints.
Our approach relies on a new ultrafast imaging setup to capture a first-of-its-kind, multi-viewpoint video dataset with picosecond-level temporal resolution.
arXiv Detail & Related papers (2024-04-09T17:48:52Z)
- Adaptive Shells for Efficient Neural Radiance Field Rendering [92.18962730460842]
We propose a neural radiance formulation that smoothly transitions between volumetric- and surface-based rendering.
Our approach enables efficient rendering at very high fidelity.
We also demonstrate that the extracted envelope enables downstream applications such as animation and simulation.
arXiv Detail & Related papers (2023-11-16T18:58:55Z)
- TraM-NeRF: Tracing Mirror and Near-Perfect Specular Reflections through Neural Radiance Fields [3.061835990893184]
Implicit representations like Neural Radiance Fields (NeRF) showed impressive results for rendering of complex scenes with fine details.
We present a novel reflection tracing method tailored for the involved volume rendering within NeRF.
We derive efficient strategies for importance sampling and the transmittance computation along rays from only a few samples.
arXiv Detail & Related papers (2023-10-16T17:59:56Z)
- Neural Point Light Fields [80.98651520818785]
We introduce Neural Point Light Fields that represent scenes implicitly with a light field living on a sparse point cloud.
These point light fields are represented as a function of the ray direction and local point feature neighborhood, allowing us to interpolate the light field conditioned on training images without dense object coverage and parallax.
arXiv Detail & Related papers (2021-12-02T18:20:10Z)
- MVSNeRF: Fast Generalizable Radiance Field Reconstruction from Multi-View Stereo [52.329580781898116]
We present MVSNeRF, a novel neural rendering approach that can efficiently reconstruct neural radiance fields for view synthesis.
Unlike prior works on neural radiance fields that consider per-scene optimization on densely captured images, we propose a generic deep neural network that can reconstruct radiance fields from only three nearby input views via fast network inference.
arXiv Detail & Related papers (2021-03-29T13:15:23Z)
- Neural Reflectance Fields for Appearance Acquisition [61.542001266380375]
We present Neural Reflectance Fields, a novel deep scene representation that encodes volume density, normal and reflectance properties at any 3D point in a scene.
We combine this representation with a physically-based differentiable ray marching framework that can render images from a neural reflectance field under any viewpoint and light.
arXiv Detail & Related papers (2020-08-09T22:04:36Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.