Differentiable Neural Radiosity
- URL: http://arxiv.org/abs/2201.13190v1
- Date: Mon, 31 Jan 2022 12:53:37 GMT
- Title: Differentiable Neural Radiosity
- Authors: Saeed Hadadan, Matthias Zwicker
- Abstract summary: We introduce Differentiable Neural Radiosity, a novel method of representing the solution of the differential rendering equation using a neural network.
Inspired by neural radiosity techniques, we minimize the norm of the residual of the differential rendering equation to directly optimize our network.
- Score: 28.72382947011186
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We introduce Differentiable Neural Radiosity, a novel method of representing
the solution of the differential rendering equation using a neural network.
Inspired by neural radiosity techniques, we minimize the norm of the residual
of the differential rendering equation to directly optimize our network. The
network is capable of outputting continuous, view-independent gradients of the
radiance field with respect to scene parameters, taking into account
differential global illumination effects while keeping memory and time
complexity constant in path length. To solve inverse rendering problems, we use
a pre-trained instance of our network that represents the differential radiance
field with respect to a limited number of scene parameters. In our experiments,
we leverage this to achieve faster and more accurate convergence compared to
other techniques such as Automatic Differentiation, Radiative Backpropagation,
and Path Replay Backpropagation.
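The core idea in the abstract, minimizing the norm of the residual of a rendering equation instead of solving it directly, can be illustrated on classical radiosity. The sketch below is purely illustrative and not the paper's method: it uses a plain vector of patch radiosities in place of a neural network, a toy form-factor matrix, and gradient descent on the squared residual norm of the linear transport equation B = E + K B.

```python
import numpy as np

# Toy residual-norm minimization (hypothetical setup, not the paper's code):
# classical radiosity satisfies B = E + K @ B, where E is emission and
# K = rho * F couples patches through form factors F. Rather than inverting
# (I - K), treat B as free variables (standing in for a network's outputs)
# and minimize 0.5 * ||B - E - K @ B||^2 by gradient descent.

rng = np.random.default_rng(0)
n = 8
F = rng.random((n, n))
F /= F.sum(axis=1, keepdims=True)   # row-stochastic form factors
rho = 0.5                           # uniform reflectance < 1 keeps K contractive
K = rho * F
E = rng.random(n)                   # emission

A = np.eye(n) - K
B = np.zeros(n)                     # "network output", initialized to zero
lr = 0.5
for _ in range(2000):
    r = A @ B - E                   # residual of the transport equation
    B -= lr * (A.T @ r)             # gradient of 0.5 * ||r||^2 w.r.t. B

B_exact = np.linalg.solve(A, E)     # direct solve for comparison
```

Driving the residual norm to zero recovers the same solution as the direct solve; the paper applies this principle to the differential rendering equation with a neural network as the unknown.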
Related papers
- Fast and Accurate Neural Rendering Using Semi-Gradients [2.977255700811213]
We propose a neural network-based framework for global illumination rendering.
We identify the bias and high variance present in the gradient estimates of the residual-based objective function as the cause of quality and convergence issues.
arXiv Detail & Related papers (2024-10-14T04:30:38Z)
- NeRF-Casting: Improved View-Dependent Appearance with Consistent Reflections [57.63028964831785]
Recent works have improved NeRF's ability to render detailed specular appearance of distant environment illumination, but are unable to synthesize consistent reflections of closer content.
We address these issues with an approach based on ray tracing.
Instead of querying an expensive neural network for the outgoing view-dependent radiance at points along each camera ray, our model casts rays from these points and traces them through the NeRF representation to render feature vectors.
arXiv Detail & Related papers (2024-05-23T17:59:57Z)
- Rethinking Directional Integration in Neural Radiance Fields [8.012147983948665]
We introduce a modification to the NeRF rendering equation which is as simple as a few lines of code change for any NeRF variations.
We show that the modified equation can be interpreted as light field rendering with learned ray embeddings.
arXiv Detail & Related papers (2023-11-28T18:59:50Z)
- Inverse Global Illumination using a Neural Radiometric Prior [26.29610954064107]
Inverse rendering methods that account for global illumination are becoming more popular.
This paper proposes a radiometric prior as a simple alternative to building complete path integrals in a traditional differentiable path tracer.
arXiv Detail & Related papers (2023-05-03T15:36:39Z)
- IntrinsicNeRF: Learning Intrinsic Neural Radiance Fields for Editable Novel View Synthesis [90.03590032170169]
We present intrinsic neural radiance fields, dubbed IntrinsicNeRF, which introduce intrinsic decomposition into the NeRF-based neural rendering method.
Our experiments and editing samples on both object-specific/room-scale scenes and synthetic/real-world data demonstrate that we can obtain consistent intrinsic decomposition results.
arXiv Detail & Related papers (2022-10-02T22:45:11Z)
- InfoNeRF: Ray Entropy Minimization for Few-Shot Neural Volume Rendering [55.70938412352287]
We present an information-theoretic regularization technique for few-shot novel view synthesis based on neural implicit representation.
The proposed approach minimizes the potential reconstruction inconsistency that arises from insufficient viewpoints.
We achieve consistently improved performance compared to existing neural view synthesis methods by large margins on multiple standard benchmarks.
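The ray-entropy idea summarized above can be sketched numerically. The following is a hedged illustration, not InfoNeRF's exact formulation: it computes standard volume-rendering weights along one ray, normalizes them into a distribution, and measures its Shannon entropy, which such a regularizer would penalize to concentrate density on few samples.

```python
import numpy as np

# Hedged sketch (hypothetical helper names, not the paper's code): NeRF's
# volume rendering assigns sample i the weight w_i = T_i * (1 - exp(-sigma_i * delta_i)),
# where T_i is accumulated transmittance. A ray-entropy regularizer
# normalizes the weights into a distribution and penalizes its entropy.

def ray_weights(sigma, delta):
    """Volume-rendering weights from densities and segment lengths."""
    alpha = 1.0 - np.exp(-sigma * delta)
    trans = np.cumprod(np.concatenate([[1.0], 1.0 - alpha[:-1]]))
    return trans * alpha

def ray_entropy(sigma, delta, eps=1e-10):
    """Shannon entropy of the normalized weight distribution of one ray."""
    w = ray_weights(sigma, delta)
    p = w / (w.sum() + eps)
    return -np.sum(p * np.log(p + eps))

delta = np.full(8, 0.1)
diffuse = ray_entropy(np.ones(8), delta)                            # density spread out
peaked = ray_entropy(np.array([0., 0., 50., 0., 0., 0., 0., 0.]), delta)  # one surface hit
```

A ray whose density concentrates at a single surface yields near-zero entropy, while diffusely spread density yields high entropy; minimizing this quantity discourages the floater-like artifacts common in few-shot settings.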
arXiv Detail & Related papers (2021-12-31T11:56:01Z)
- NerfingMVS: Guided Optimization of Neural Radiance Fields for Indoor Multi-view Stereo [97.07453889070574]
We present a new multi-view depth estimation method that utilizes both conventional SfM reconstruction and learning-based priors.
We show that our proposed framework significantly outperforms state-of-the-art methods on indoor scenes.
arXiv Detail & Related papers (2021-09-02T17:54:31Z)
- Neural Radiosity [31.35525999999182]
We introduce Neural Radiosity, an algorithm to solve the rendering equation by minimizing the norm of its residual.
Our approach decouples solving the radiance equation from rendering (perspective) images, and allows us to efficiently synthesize arbitrary views of a scene.
arXiv Detail & Related papers (2021-05-26T04:10:00Z)
- MVSNeRF: Fast Generalizable Radiance Field Reconstruction from Multi-View Stereo [52.329580781898116]
We present MVSNeRF, a novel neural rendering approach that can efficiently reconstruct neural radiance fields for view synthesis.
Unlike prior works on neural radiance fields that consider per-scene optimization on densely captured images, we propose a generic deep neural network that can reconstruct radiance fields from only three nearby input views via fast network inference.
arXiv Detail & Related papers (2021-03-29T13:15:23Z)
- Time Dependence in Non-Autonomous Neural ODEs [74.78386661760662]
We propose a novel family of Neural ODEs with time-varying weights.
We outperform previous Neural ODE variants in both speed and representational capacity.
arXiv Detail & Related papers (2020-05-05T01:41:46Z)
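The non-autonomous Neural ODE entry above refers to dynamics whose weights change with time. A minimal sketch under assumed design choices (linear time interpolation of weights, fixed-step Euler integration, and all variable names hypothetical) might look like this:

```python
import numpy as np

# Hedged sketch of a non-autonomous neural ODE (illustrative only, not the
# paper's architecture): the hidden state evolves as dz/dt = f(z, t), where
# f's weight matrix depends on time, W(t) = W0 + t * W1, so the vector field
# itself can change along the trajectory.

def f(z, t, W0, W1, b):
    W = W0 + t * W1                  # time-varying weights
    return np.tanh(W @ z + b)

def odeint_euler(z0, t0, t1, steps, W0, W1, b):
    """Integrate dz/dt = f(z, t) with fixed-step forward Euler."""
    z, t = z0.copy(), t0
    h = (t1 - t0) / steps
    for _ in range(steps):
        z = z + h * f(z, t, W0, W1, b)
        t += h
    return z

rng = np.random.default_rng(1)
d = 4
W0 = rng.standard_normal((d, d)) * 0.1
W1 = rng.standard_normal((d, d)) * 0.1
b = np.zeros(d)
z1 = odeint_euler(np.ones(d), 0.0, 1.0, 100, W0, W1, b)
```

An autonomous Neural ODE corresponds to the special case W1 = 0; letting the weights depend on t is one way to increase representational capacity without deepening f.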
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.