TensoFlow: Tensorial Flow-based Sampler for Inverse Rendering
- URL: http://arxiv.org/abs/2503.18328v1
- Date: Mon, 24 Mar 2025 04:09:46 GMT
- Title: TensoFlow: Tensorial Flow-based Sampler for Inverse Rendering
- Authors: Chun Gu, Xiaofei Wei, Li Zhang, Xiatian Zhu
- Abstract summary: Inverse rendering aims to recover scene geometry, material properties, and lighting from multi-view images. Given the complexity of light-surface interactions, importance sampling is essential for evaluating the rendering equation. Existing inverse rendering methods typically rely on manually pre-defined, non-learnable importance samplers. We propose learning a spatially and directionally aware importance sampler for the rendering equation to accurately and flexibly capture the unconstrained complexity of a typical scene.
- Score: 38.74244725059936
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Inverse rendering aims to recover scene geometry, material properties, and lighting from multi-view images. Given the complexity of light-surface interactions, importance sampling is essential for evaluating the rendering equation, as it reduces variance and enhances the efficiency of Monte Carlo sampling. Existing inverse rendering methods typically rely on manually pre-defined, non-learnable importance samplers, which struggle to match the spatially and directionally varying integrand, resulting in high variance and suboptimal performance. To address this limitation, we propose the concept of learning a spatially and directionally aware importance sampler for the rendering equation to accurately and flexibly capture the unconstrained complexity of a typical scene. We further formulate TensoFlow, a generic approach for sampler learning in inverse rendering that closely matches the integrand of the rendering equation both spatially and directionally. Concretely, our sampler is parameterized by normalizing flows, allowing both directional sampling of incident light and probability density function (PDF) inference. To capture the sampler's spatial characteristics, we learn a tensorial representation of the scene space, which provides spatial conditioning that, together with the reflected direction, yields spatially and directionally aware sampling distributions. Our model can be optimized by minimizing the difference between the integrand and our normalizing flow. Extensive experiments validate the superiority of TensoFlow over prior alternatives on both synthetic and real-world benchmarks.
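The abstract's core premise, that a sampling PDF matched to the integrand reduces Monte Carlo variance, can be seen in a toy one-dimensional example. This is purely illustrative and not the paper's method: the integrand f(x) = x^3 and the hand-picked PDF p(x) = 2x are stand-ins for the rendering-equation integrand and a learned sampler.

```python
import numpy as np

# Toy illustration: importance sampling reduces Monte Carlo variance
# when the sampling PDF roughly matches the shape of the integrand.
# Integrand f(x) = x^3 on [0, 1]; true integral = 1/4.
rng = np.random.default_rng(0)
n = 100_000
f = lambda x: x ** 3

# Uniform sampling: p(x) = 1 on [0, 1].
xu = rng.random(n)
est_uniform = f(xu).mean()
var_uniform = f(xu).var()

# Importance sampling with p(x) = 2x, drawn via the inverse CDF x = sqrt(u).
# This PDF grows with x, loosely tracking the integrand.
xi = np.sqrt(rng.random(n))
w = f(xi) / (2.0 * xi)        # Monte Carlo weights f(x) / p(x)
est_is = w.mean()
var_is = w.var()

print(est_uniform, est_is)     # both near 0.25
print(var_is < var_uniform)    # importance sampling has lower variance
```

Both estimators are unbiased, but the per-sample variance of the importance-sampled estimator is roughly four times lower here; TensoFlow's contribution is to learn such a matched PDF, conditioned on position and direction, rather than fixing it by hand.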
Related papers
- Diffusing Differentiable Representations [60.72992910766525]
We introduce a novel, training-free method for sampling differentiable representations (diffreps) using pretrained diffusion models.
We identify an implicit constraint on the samples induced by the diffrep and demonstrate that addressing this constraint significantly improves the consistency and detail of the generated objects.
arXiv Detail & Related papers (2024-12-09T20:42:58Z)
- Neural Product Importance Sampling via Warp Composition [9.846719854600709]
We present a learning-based method that uses normalizing flows to efficiently importance sample illumination product integrals.
We demonstrate variance reduction over prior methods on a range of applications comprising complex geometry, materials and illumination.
arXiv Detail & Related papers (2024-09-12T15:38:21Z)
- Entropy-MCMC: Sampling from Flat Basins with Ease [10.764160559530849]
We introduce an auxiliary guiding variable, the stationary distribution of which resembles a smoothed posterior free from sharp modes, to lead the MCMC sampler to flat basins.
By integrating this guiding variable with the model parameter, we create a simple joint distribution that enables efficient sampling with minimal computational overhead.
Empirical results demonstrate that our method can successfully sample from flat basins of the posterior, and outperforms all compared baselines on multiple benchmarks.
arXiv Detail & Related papers (2023-10-09T04:40:20Z)
- Reflected Diffusion Models [93.26107023470979]
We present Reflected Diffusion Models, which reverse a reflected differential equation evolving on the support of the data.
Our approach learns the score function through a generalized score matching loss and extends key components of standard diffusion models.
arXiv Detail & Related papers (2023-04-10T17:54:38Z)
- Sensing Cox Processes via Posterior Sampling and Positive Bases [56.82162768921196]
We study adaptive sensing of point processes, a widely used model from spatial statistics.
We model the intensity function as a sample from a truncated Gaussian process, represented in a specially constructed positive basis.
Our adaptive sensing algorithms use Langevin dynamics and are based on posterior sampling (Cox-Thompson) and top-two posterior sampling (Top2) principles.
arXiv Detail & Related papers (2021-10-21T14:47:06Z)
- Neural UpFlow: A Scene Flow Learning Approach to Increase the Apparent Resolution of Particle-Based Liquids [0.6882042556551611]
We present a novel up-resing technique for generating high-resolution liquids based on scene flow estimation using deep neural networks.
Our approach infers and synthesizes small- and large-scale details solely from a low-resolution particle-based liquid simulation.
arXiv Detail & Related papers (2021-06-09T15:36:23Z)
- Neural BRDF Representation and Importance Sampling [79.84316447473873]
We present a compact neural network-based representation of reflectance BRDF data.
We encode BRDFs as lightweight networks, and propose a training scheme with adaptive angular sampling.
We evaluate encoding results on isotropic and anisotropic BRDFs from multiple real-world datasets.
arXiv Detail & Related papers (2021-02-11T12:00:24Z)
- Spatially Adaptive Inference with Stochastic Feature Sampling and Interpolation [72.40827239394565]
We propose to compute features only at sparsely sampled locations.
We then densely reconstruct the feature map with an efficient procedure.
The presented network is experimentally shown to save substantial computation while maintaining accuracy over a variety of computer vision tasks.
arXiv Detail & Related papers (2020-03-19T15:36:31Z)
- Learning to Importance Sample in Primary Sample Space [22.98252856114423]
We propose a novel importance sampling technique that uses a neural network to learn how to sample from a desired density represented by a set of samples.
We show that our approach leads to effective variance reduction in several practical scenarios.
arXiv Detail & Related papers (2018-08-23T16:55:53Z)
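A recurring ingredient above, and in TensoFlow itself, is a normalizing flow's ability to both draw samples and evaluate their exact density via the change-of-variables formula. A minimal one-transform sketch (a single affine map on a Gaussian base; the parameters `a` and `b` are hypothetical stand-ins for learned flow parameters, not anything from the paper):

```python
import numpy as np

# Minimal "normalizing flow" sketch: one invertible affine map x = a*z + b
# over a standard-normal base. The change-of-variables formula gives an
# exact, tractable PDF -- the property that lets a flow support both
# directional sampling and PDF inference.
rng = np.random.default_rng(1)
a, b = 2.0, 0.5                       # hypothetical learned flow parameters

def sample(n):
    z = rng.standard_normal(n)        # draw from the base distribution
    return a * z + b                  # push through the forward transform

def log_pdf(x):
    z = (x - b) / a                   # invert the transform
    log_base = -0.5 * z**2 - 0.5 * np.log(2 * np.pi)
    return log_base - np.log(abs(a))  # subtract log|det Jacobian|

xs = sample(200_000)
print(xs.mean(), xs.std())            # approximately b and |a|
```

Real flows stack many such invertible maps (and condition them, in TensoFlow's case, on a tensorial spatial feature and the reflected direction), but every layer follows this same sample/inverse/log-det pattern.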
This list is automatically generated from the titles and abstracts of the papers listed on this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.