Neural Control Variates with Automatic Integration
- URL: http://arxiv.org/abs/2409.15394v1
- Date: Mon, 23 Sep 2024 06:04:28 GMT
- Title: Neural Control Variates with Automatic Integration
- Authors: Zilu Li, Guandao Yang, Qingqing Zhao, Xi Deng, Leonidas Guibas, Bharath Hariharan, Gordon Wetzstein
- Abstract summary: This paper proposes a novel approach to construct learnable parametric control variate functions from arbitrary neural network architectures.
We use the network to approximate the antiderivative of the integrand.
We apply our method to solve partial differential equations using the Walk-on-Spheres algorithm.
- Score: 49.91408797261987
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This paper presents a method to leverage arbitrary neural network architectures for control variates. Control variates are crucial in reducing the variance of Monte Carlo integration, but they hinge on finding a function that both correlates with the integrand and has a known analytical integral. Traditional approaches rely on heuristics to choose this function, which might not be expressive enough to correlate well with the integrand. Recent research alleviates this issue by modeling the integrand with a learnable parametric model, such as a neural network. However, the challenge remains to create an expressive parametric model with a known analytical integral. This paper proposes a novel approach to construct learnable parametric control variate functions from arbitrary neural network architectures. Instead of using a network to approximate the integrand directly, we employ the network to approximate the antiderivative of the integrand. Automatic differentiation then yields a control variate whose integral is given exactly by evaluating the antiderivative network. We apply our method to solve partial differential equations using the Walk-on-Spheres algorithm. Our results indicate that this approach is unbiased and that it achieves lower variance than other control variate methods across a variety of network architectures.
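The core construction is simple enough to sketch. Below is a minimal 1D illustration in PyTorch; the MLP architecture, the uniform sampler on [a, b], and the idea of fitting g to f by least squares are assumptions for illustration, not details taken from the paper:

```python
import torch

# Antiderivative network G: R -> R. Its derivative g = dG/dx serves as the
# control variate, and its integral over [a, b] is exactly G(b) - G(a) by
# the fundamental theorem of calculus. Any differentiable architecture works.
G = torch.nn.Sequential(
    torch.nn.Linear(1, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 1),
)

def g(x):
    """Control variate g(x) = dG/dx, obtained by automatic differentiation."""
    x = x.requires_grad_(True)
    return torch.autograd.grad(G(x).sum(), x, create_graph=True)[0]

def cv_estimate(f, a, b, n=1024):
    """Unbiased estimate of the integral of f over [a, b]:
    (b - a) * mean(f(x) - g(x)) + (G(b) - G(a)), with x ~ U(a, b)."""
    x = a + (b - a) * torch.rand(n, 1)
    residual = (b - a) * (f(x) - g(x)).mean()
    analytic = (G(torch.tensor([[b]])) - G(torch.tensor([[a]]))).squeeze()
    return residual + analytic

# Training G so that g matches f (e.g., minimizing mean((f(x) - g(x))**2))
# shrinks the residual term and hence the variance; the estimator stays
# unbiased regardless, because the analytic part is exact.
```

The Walk-on-Spheres application mentioned above is a classical grid-free Monte Carlo solver for elliptic PDEs. Here is a minimal sketch of the plain algorithm (independent of the paper's control variates) for the Laplace equation on the unit disk, with hypothetical Dirichlet data chosen so the answer is known:

```python
import math, random

def dist_to_boundary(x, y):
    return 1.0 - math.hypot(x, y)  # distance to the unit-disk boundary

def boundary_value(x, y):
    return x  # u(x, y) = x is harmonic, so WoS should recover it

def walk_on_spheres(x, y, eps=1e-4):
    """One sample of u(x, y): repeatedly jump to a uniform point on the
    largest circle centered at the current point, stopping once within
    eps of the boundary and reading off the boundary data there."""
    while dist_to_boundary(x, y) >= eps:
        r = dist_to_boundary(x, y)
        theta = random.uniform(0.0, 2.0 * math.pi)
        x, y = x + r * math.cos(theta), y + r * math.sin(theta)
    return boundary_value(x, y)

print(sum(walk_on_spheres(0.3, 0.2) for _ in range(10000)) / 10000)  # ~0.3
```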
Related papers
- Fixed Integral Neural Networks [2.2118683064997273]
We present a method for representing the analytical integral of a learned function $f$.
This allows the exact integral of a neural network to be computed, and enables constrained neural networks to be parametrised.
We also introduce a method to constrain $f$ to be positive, a necessary condition for many applications.
arXiv Detail & Related papers (2023-07-26T18:16:43Z) - Integral Transforms in a Physics-Informed (Quantum) Neural Network setting: Applications & Use-Cases [1.7403133838762446]
Many computational problems in engineering and science require not only the differentiation of functions or models but also their integration.
In this work, we propose to augment the paradigm of Physics-Informed Neural Networks with automatic integration.
arXiv Detail & Related papers (2022-06-28T17:51:32Z) - NeuralEF: Deconstructing Kernels by Deep Neural Networks [47.54733625351363]
Traditional nonparametric solutions based on the Nyström formula suffer from scalability issues.
Recent work has resorted to a parametric approach, i.e., training neural networks to approximate the eigenfunctions.
We show that these problems can be fixed by using a new series of objective functions that generalize to both supervised and unsupervised learning problems.
arXiv Detail & Related papers (2022-04-30T05:31:07Z) - Message Passing Neural PDE Solvers [60.77761603258397]
We build a neural message passing solver, replacing all heuristically designed components in the computation graph with backprop-optimized neural function approximators.
We show that neural message passing solvers representationally contain some classical methods, such as finite differences, finite volumes, and WENO schemes.
We validate our method on various fluid-like flow problems, demonstrating fast, stable, and accurate performance across different domain topologies, equation parameters, discretizations, etc., in 1D and 2D.
arXiv Detail & Related papers (2022-02-07T17:47:46Z) - Polynomial-Spline Neural Networks with Exact Integrals [0.0]
We develop a novel neural network architecture that combines a mixture-of-experts model with free knot B1-spline basis functions.
Our architecture exhibits both $h$- and $p$-refinement for regression problems at the convergence rates expected from approximation theory.
We demonstrate the success of our network on a range of regression and variational problems that illustrate the consistency and exact integrability of our network architecture.
arXiv Detail & Related papers (2021-10-26T22:12:37Z) - AutoInt: Automatic Integration for Fast Neural Volume Rendering [51.46232518888791]
We propose a new framework for learning efficient, closed-form solutions to integrals using implicit neural representation networks.
We demonstrate a greater than 10x improvement in computation requirements, enabling fast neural volume rendering.
arXiv Detail & Related papers (2020-12-03T05:46:10Z) - Multipole Graph Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data in a structure suitable for neural networks.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z) - Neural Control Variates [71.42768823631918]
We show that a set of neural networks can tackle the challenge of finding a good approximation of the integrand.
We derive a theoretically optimal, variance-minimizing loss function, and propose an alternative, composite loss for stable online training in practice (the classical estimator these losses build on is given after this list).
Specifically, we show that the learned light-field approximation is of sufficient quality for high-order bounces, allowing us to omit the error correction and thereby dramatically reduce the noise at the cost of negligible visible bias.
arXiv Detail & Related papers (2020-06-02T11:17:55Z)
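For background (standard Monte Carlo material, not specific to any single paper above): given a control variate $g$ with known integral $G = \int_\Omega g(x)\,dx$ and samples $x_i \sim p$, the estimator

$$\hat{F} = \frac{1}{N} \sum_{i=1}^{N} \frac{f(x_i) - \alpha\, g(x_i)}{p(x_i)} + \alpha\, G$$

is unbiased for every coefficient $\alpha$; its variance is minimized at $\alpha^* = \mathrm{Cov}(f/p,\, g/p) / \mathrm{Var}(g/p)$, where it shrinks by the factor $1 - \rho^2$, with $\rho$ the correlation between $f/p$ and $g/p$. This is why both the present paper and Neural Control Variates focus on making $g$ correlate strongly with $f$ while keeping $G$ analytically available.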