Manifold Sampling for Differentiable Uncertainty in Radiance Fields
- URL: http://arxiv.org/abs/2409.12661v1
- Date: Thu, 19 Sep 2024 11:22:20 GMT
- Title: Manifold Sampling for Differentiable Uncertainty in Radiance Fields
- Authors: Linjie Lyu, Ayush Tewari, Marc Habermann, Shunsuke Saito, Michael Zollhöfer, Thomas Leimkühler, Christian Theobalt
- Abstract summary: We propose a versatile approach for learning Gaussian radiance fields with explicit and fine-grained uncertainty estimates.
We demonstrate state-of-the-art performance on next-best-view planning tasks.
- Score: 82.17927517146929
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Radiance fields are powerful and, hence, popular models for representing the appearance of complex scenes. Yet, constructing them based on image observations gives rise to ambiguities and uncertainties. We propose a versatile approach for learning Gaussian radiance fields with explicit and fine-grained uncertainty estimates that impose only little additional cost compared to uncertainty-agnostic training. Our key observation is that uncertainties can be modeled as a low-dimensional manifold in the space of radiance field parameters that is highly amenable to Monte Carlo sampling. Importantly, our uncertainties are differentiable and, thus, allow for gradient-based optimization of subsequent captures that optimally reduce ambiguities. We demonstrate state-of-the-art performance on next-best-view planning tasks, including high-dimensional illumination planning for optimal radiance field relighting quality.
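To make the core idea concrete, here is a minimal NumPy sketch of Monte Carlo uncertainty estimation over a low-dimensional manifold of radiance-field parameters. The toy linear renderer and the affine manifold parameterization (a mean parameter vector plus a basis times a Gaussian latent) are assumptions chosen for illustration, not the paper's exact construction.

```python
import numpy as np

def render(params, view):
    """Toy stand-in for a Gaussian radiance field renderer: pixels = view @ params."""
    return view @ params

def manifold_mc_uncertainty(mean_params, basis, view, n_samples=64, rng=None):
    """Estimate per-pixel mean and variance by Monte Carlo sampling radiance-field
    parameters on a low-dimensional manifold. Here the manifold is the affine
    subspace mean_params + basis @ z with z ~ N(0, I), which is an assumption of
    this sketch."""
    rng = np.random.default_rng() if rng is None else rng
    renders = []
    for _ in range(n_samples):
        z = rng.standard_normal(basis.shape[1])   # low-dimensional latent sample
        params = mean_params + basis @ z          # point on the parameter manifold
        renders.append(render(params, view))
    renders = np.stack(renders)
    return renders.mean(axis=0), renders.var(axis=0)

# Toy usage: 16 parameters, a 4-dimensional uncertainty manifold, 8 "pixels".
rng = np.random.default_rng(0)
mean_params = rng.standard_normal(16)
basis = 0.1 * rng.standard_normal((16, 4))
view = rng.standard_normal((8, 16))
pixel_mean, pixel_var = manifold_mc_uncertainty(mean_params, basis, view, rng=rng)
```

Because every step above is differentiable with respect to the view, the same construction could in principle be wrapped in an autodiff framework to optimize subsequent captures, which is the use case the abstract highlights.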
Related papers
- ProvNeRF: Modeling per Point Provenance in NeRFs as a Stochastic Field [52.09661042881063]
We propose an approach that models the provenance of each point in a NeRF -- i.e., the locations from which it is likely visible -- as a stochastic field.
We show that modeling per-point provenance during NeRF optimization enriches the model with information, leading to improvements in novel view synthesis and uncertainty estimation.
arXiv Detail & Related papers (2024-01-16T06:19:18Z) - FisherRF: Active View Selection and Uncertainty Quantification for Radiance Fields using Fisher Information [32.66184501415286]
This study addresses the problem of active view selection and uncertainty quantification within the domain of Radiance Fields.
NeRFs have greatly advanced image rendering and reconstruction, but the limited availability of 2D images introduces uncertainties.
By leveraging Fisher Information, we efficiently quantify observed information within Radiance Fields without ground truth data.
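The sketch below illustrates the Fisher-information view-scoring idea in the same toy Python setting: under a Gaussian observation model the Fisher information is J^T J of the rendering Jacobian, which needs no ground-truth images. The linear toy renderer, the diagonal approximation, and the log-based score are assumptions of this illustration, not FisherRF's exact formulation.

```python
import numpy as np

def render_jacobian(params, view):
    """Jacobian of rendered pixels w.r.t. parameters. For the toy linear renderer
    pixels = view @ params it is simply `view`; a real radiance field would use
    autodiff here."""
    return view

def fisher_view_scores(params, candidate_views, prior=1e-3):
    """Rank candidate views by an information proxy: the diagonal of the Fisher
    information J^T J, aggregated with a log-based score. No ground-truth images
    are needed, only the current model and the candidate camera."""
    scores = []
    for view in candidate_views:
        J = render_jacobian(params, view)
        fisher_diag = np.sum(J * J, axis=0)                 # diag(J^T J)
        scores.append(np.sum(np.log1p(fisher_diag / prior)))
    return np.array(scores)

rng = np.random.default_rng(1)
params = rng.standard_normal(16)
candidates = [rng.standard_normal((8, 16)) for _ in range(5)]
next_best_view = int(np.argmax(fisher_view_scores(params, candidates)))
```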
arXiv Detail & Related papers (2023-11-29T18:20:16Z) - Estimating 3D Uncertainty Field: Quantifying Uncertainty for Neural Radiance Fields [25.300284510832974]
We propose a novel approach to estimate a 3D Uncertainty Field based on the learned incomplete scene geometry.
By considering the accumulated transmittance along each camera ray, our Uncertainty Field infers 2D pixel-wise uncertainty.
Our experiments demonstrate that our approach is the only one that can explicitly reason about high uncertainty both in unseen 3D regions and in the corresponding rendered 2D pixels.
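The following sketch shows how a per-sample 3D uncertainty can be composited into a 2D pixel-wise value using standard volume-rendering transmittance weights, with leftover transmittance treated as fully uncertain unseen space. The specific composition rule is an assumption of this sketch, not the paper's exact formulation.

```python
import numpy as np

def ray_uncertainty(sigmas, point_uncertainty, deltas):
    """Composite per-sample 3D uncertainty into a pixel-wise value with
    volume-rendering weights; transmittance remaining past the last sample is
    counted as maximally uncertain (unseen space)."""
    alphas = 1.0 - np.exp(-sigmas * deltas)                         # per-sample opacity
    trans = np.concatenate([[1.0], np.cumprod(1.0 - alphas)[:-1]])  # transmittance before each sample
    weights = trans * alphas                                        # rendering weights
    background = trans[-1] * (1.0 - alphas[-1])                     # transmittance past the last sample
    return np.sum(weights * point_uncertainty) + background

sigmas = np.array([0.1, 0.5, 2.0, 4.0])   # toy densities along one camera ray
unc = np.array([0.9, 0.6, 0.2, 0.1])      # toy 3D uncertainty at the ray samples
deltas = np.full(4, 0.25)                 # step sizes between samples
pixel_unc = ray_uncertainty(sigmas, unc, deltas)
```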
arXiv Detail & Related papers (2023-11-03T09:47:53Z) - SIRe-IR: Inverse Rendering for BRDF Reconstruction with Shadow and Illumination Removal in High-Illuminance Scenes [51.50157919750782]
We present SIRe-IR, an implicit neural inverse rendering approach that decomposes the scene into an environment map, albedo, and roughness.
By accurately modeling the indirect radiance field, normal, visibility, and direct light simultaneously, we are able to remove both shadows and indirect illumination.
Even in the presence of intense illumination, our method recovers high-quality albedo and roughness with no shadow interference.
arXiv Detail & Related papers (2023-10-19T10:44:23Z) - FG-NeRF: Flow-GAN based Probabilistic Neural Radiance Field for Independence-Assumption-Free Uncertainty Estimation [28.899779240902703]
We propose an independence-assumption-free probabilistic neural radiance field based on Flow-GAN.
By combining the generative capability of adversarial learning with the expressive power of normalizing flows, our method explicitly models the density-radiance distribution of the scene.
Our method demonstrates state-of-the-art performance by predicting lower rendering errors and more reliable uncertainty on both synthetic and real-world datasets.
arXiv Detail & Related papers (2023-09-28T12:05:08Z) - TensoIR: Tensorial Inverse Rendering [51.57268311847087]
TensoIR is a novel inverse rendering approach based on tensor factorization and neural fields.
It builds on TensoRF, a state-of-the-art approach for radiance field modeling.
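The sketch below illustrates the kind of factorized grid query that TensoRF-style methods rely on: with a CP decomposition, a 3D lookup reduces to three 1D interpolations per rank-1 component. The choice of the CP variant (TensoRF also uses a vector-matrix decomposition) and all names here are illustrative assumptions.

```python
import numpy as np

def interp1d(vec, t):
    """Linearly interpolate a 1D factor vector at a normalized coordinate t in [0, 1]."""
    idx = t * (len(vec) - 1)
    lo = int(np.floor(idx))
    hi = min(lo + 1, len(vec) - 1)
    w = idx - lo
    return (1.0 - w) * vec[lo] + w * vec[hi]

def cp_density(factors, x, y, z):
    """Query a CP-factorized 3D density grid: the full grid is a sum of rank-1
    components, each an outer product of three 1D vectors, so a point query only
    needs three 1D interpolations per component instead of a dense 3D lookup."""
    return sum(interp1d(fx, x) * interp1d(fy, y) * interp1d(fz, z)
               for fx, fy, fz in factors)

rng = np.random.default_rng(2)
rank, res = 8, 32
factors = [(rng.standard_normal(res), rng.standard_normal(res), rng.standard_normal(res))
           for _ in range(rank)]
density = cp_density(factors, 0.3, 0.7, 0.5)
```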
arXiv Detail & Related papers (2023-04-24T21:39:13Z) - Improving Generalization via Uncertainty Driven Perturbations [107.45752065285821]
We consider uncertainty-driven perturbations of the training data points.
Unlike loss-driven perturbations, uncertainty-guided perturbations do not cross the decision boundary.
We show that UDP is guaranteed to achieve the maximally robust decision boundary on linear models.
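A toy sketch of the uncertainty-driven perturbation idea follows: a training point is moved along the gradient of a predictive-uncertainty proxy (softmax entropy, estimated here by finite differences for a linear classifier) rather than the loss gradient. The uncertainty measure and update rule are assumptions of this illustration, not the paper's exact algorithm.

```python
import numpy as np

def predictive_entropy(logits):
    """Entropy of the softmax distribution, used here as the uncertainty proxy."""
    p = np.exp(logits - logits.max())
    p /= p.sum()
    return -np.sum(p * np.log(p + 1e-12))

def uncertainty_driven_perturbation(x, W, b, eps=0.1, fd_step=1e-4):
    """Perturb x along the finite-difference gradient of predictive entropy,
    i.e., toward higher model uncertainty instead of higher loss."""
    base = predictive_entropy(W @ x + b)
    grad = np.zeros_like(x)
    for i in range(x.size):
        xp = x.copy()
        xp[i] += fd_step
        grad[i] = (predictive_entropy(W @ xp + b) - base) / fd_step
    return x + eps * grad / (np.linalg.norm(grad) + 1e-12)

rng = np.random.default_rng(3)
W, b = rng.standard_normal((3, 5)), rng.standard_normal(3)
x = rng.standard_normal(5)
x_perturbed = uncertainty_driven_perturbation(x, W, b)
```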
arXiv Detail & Related papers (2022-02-11T16:22:08Z) - Improving black-box optimization in VAE latent space using decoder uncertainty [25.15359244726929]
We introduce an importance sampling-based estimator that provides more robust estimates of uncertainty.
It produces samples with a better trade-off between the black-box objective and the validity of the generated samples, sometimes improving both simultaneously.
We illustrate these advantages across several experimental settings in digit generation, arithmetic expression approximation and molecule generation for drug design.
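As a generic illustration of the importance-sampling ingredient, the sketch below estimates an expected uncertainty under a target distribution from samples drawn under a proposal, using self-normalized weights. The uncertainty function, distributions, and names are assumptions of this sketch; the paper's specific estimator is not reproduced here.

```python
import numpy as np

def snis_estimate(values, log_p, log_q):
    """Self-normalized importance sampling: estimate E_p[f(z)] from samples z ~ q
    by weighting each value with p(z)/q(z), normalized across the batch."""
    log_w = log_p - log_q
    w = np.exp(log_w - log_w.max())
    w /= w.sum()
    return float(np.sum(w * values))

rng = np.random.default_rng(4)
z = rng.normal(loc=0.5, scale=1.5, size=1000)                  # samples from proposal q = N(0.5, 1.5^2)
log_q = -0.5 * ((z - 0.5) / 1.5) ** 2 - np.log(1.5 * np.sqrt(2 * np.pi))
log_p = -0.5 * z ** 2 - np.log(np.sqrt(2 * np.pi))             # target p: standard normal prior
uncertainty = z ** 2                                           # toy per-sample uncertainty value
estimate = snis_estimate(uncertainty, log_p, log_q)            # approx. E_p[z^2] = 1
```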
arXiv Detail & Related papers (2021-06-30T20:46:18Z) - Sparse Needlets for Lighting Estimation with Spherical Transport Loss [89.52531416604774]
NeedleLight is a new lighting estimation model that represents illumination with needlets and enables lighting estimation jointly in the frequency and spatial domains.
Extensive experiments show that NeedleLight achieves superior lighting estimation consistently across multiple evaluation metrics as compared with state-of-the-art methods.
arXiv Detail & Related papers (2021-06-24T15:19:42Z)