A blob method for inhomogeneous diffusion with applications to
multi-agent control and sampling
- URL: http://arxiv.org/abs/2202.12927v1
- Date: Fri, 25 Feb 2022 19:49:05 GMT
- Title: A blob method for inhomogeneous diffusion with applications to
multi-agent control and sampling
- Authors: Katy Craig, Karthik Elamvazhuthi, Matt Haberland, Olga Turanova
- Abstract summary: We develop a deterministic particle method for the weighted porous medium equation (WPME) and prove its convergence on bounded time intervals.
Our method has natural applications to multi-agent coverage algorithms and sampling probability measures.
- Score: 0.6562256987706128
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: As a counterpoint to classical stochastic particle methods for linear
diffusion equations, we develop a deterministic particle method for the
weighted porous medium equation (WPME) and prove its convergence on bounded
time intervals. This generalizes related work on blob methods for unweighted
porous medium equations. From a numerical analysis perspective, our method has
several advantages: it is meshfree, preserves the gradient flow structure of
the underlying PDE, converges in arbitrary dimension, and captures the correct
asymptotic behavior in simulations.
That our method succeeds in capturing the long time behavior of WPME is
significant from the perspective of related problems in quantization. Just as
the Fokker-Planck equation provides a way to quantize a probability measure
$\bar{\rho}$ by evolving an empirical measure according to stochastic Langevin
dynamics so that the empirical measure flows toward $\bar{\rho}$, our particle
method provides a way to quantize $\bar{\rho}$ according to deterministic
particle dynamics approximating WPME. In this way, our method has natural
applications to multi-agent coverage algorithms and sampling probability
measures.
A specific case of our method corresponds exactly to the mean-field dynamics
of training a two-layer neural network with a radial basis function activation
function. From this perspective, our convergence result shows that, in the
overparametrized regime and as the variance of the radial basis functions goes to
zero, the continuum limit is given by WPME. This generalizes previous results,
which considered the case of a uniform data distribution, to the more general
inhomogeneous setting. As a consequence of our convergence result, we identify
conditions on the target function and data distribution for which convexity of
the energy landscape emerges in the continuum limit.
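The deterministic particle dynamics sketched in the abstract can be illustrated in its simplest form: each particle descends the gradient of the mollified empirical density, which mimics porous-medium-type diffusion and spreads a cluster of particles apart. The Gaussian mollifier, bandwidth, and step size below are illustrative assumptions, not the paper's exact scheme; in particular the inhomogeneous weight is omitted.

```python
import numpy as np

def gaussian_mollifier_grad(d, eps):
    """Gradient of an isotropic Gaussian mollifier phi_eps at displacement d."""
    dim = d.shape[-1]
    norm = (2 * np.pi * eps**2) ** (dim / 2)
    phi = np.exp(-np.sum(d**2, axis=-1) / (2 * eps**2)) / norm
    return -d * phi[..., None] / eps**2          # grad phi_eps(x) = -x phi_eps(x) / eps^2

def blob_step(X, eps=0.3, dt=0.01):
    """One explicit Euler step of a blob-style deterministic particle update.

    Each particle moves down the gradient of the mollified empirical density,
    i.e. with velocity -grad (phi_eps * rho_n)(X_i).
    """
    n = X.shape[0]
    disp = X[:, None, :] - X[None, :, :]          # pairwise displacements X_i - X_j
    grad = gaussian_mollifier_grad(disp, eps)     # grad phi_eps(X_i - X_j)
    velocity = -grad.sum(axis=1) / n              # descend the mollified density
    return X + dt * velocity

rng = np.random.default_rng(0)
X = rng.normal(scale=0.1, size=(50, 2))           # tightly clustered particles
for _ in range(200):
    X = blob_step(X)
print(np.var(X))                                  # diffusion spreads the cluster
```

Because the velocity field is a deterministic function of the current particle positions, the empirical measure evolves without any sampling noise, in contrast to a stochastic Langevin scheme.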
Related papers
- Kinetic Interacting Particle Langevin Monte Carlo [0.0]
This paper introduces and analyses interacting underdamped Langevin algorithms, for statistical inference in latent variable models.
We propose a diffusion process that evolves jointly in the space of parameters and latent variables.
We provide two explicit discretisations of this diffusion as practical algorithms to estimate parameters of statistical models.
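As a rough illustration of the underdamped dynamics underlying such algorithms, the sketch below is a simple Euler-type discretisation for a standard-normal target; the joint parameter/latent-variable coupling of the actual algorithm is omitted, and the step size and friction coefficient are illustrative assumptions.

```python
import numpy as np

def kinetic_langevin(grad_logp, dim=1, n_steps=20000, dt=0.01, gamma=1.0, seed=0):
    """Simple discretisation of underdamped (kinetic) Langevin dynamics:
        dx = v dt,   dv = (grad_logp(x) - gamma * v) dt + sqrt(2 * gamma) dW.
    Returns the trajectory of x."""
    rng = np.random.default_rng(seed)
    x = np.zeros(dim)
    v = np.zeros(dim)
    traj = np.empty((n_steps, dim))
    for i in range(n_steps):
        v += dt * (grad_logp(x) - gamma * v)           # drift on the velocity
        v += np.sqrt(2.0 * gamma * dt) * rng.normal(size=dim)  # thermal noise
        x = x + dt * v                                 # transport the position
        traj[i] = x
    return traj

# standard-normal target: grad log p(x) = -x
traj = kinetic_langevin(lambda x: -x)
samples = traj[2000:, 0]                 # discard burn-in
print(samples.mean(), samples.var())     # should be roughly (0, 1)
```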
arXiv Detail & Related papers (2024-07-08T09:52:46Z)
- Noise-Free Sampling Algorithms via Regularized Wasserstein Proximals [3.4240632942024685]
We consider the problem of sampling from a distribution governed by a potential function.
This work proposes an explicit score-based MCMC method that is deterministic, resulting in a deterministic evolution for particles.
arXiv Detail & Related papers (2023-08-28T23:51:33Z)
- Interacting Particle Langevin Algorithm for Maximum Marginal Likelihood Estimation [2.53740603524637]
We develop a class of interacting particle systems for implementing a maximum marginal likelihood estimation procedure.
In particular, we prove that the parameter marginal of the stationary measure of this diffusion has the form of a Gibbs measure.
Using a particular rescaling, we then prove geometric ergodicity of this system and bound the discretisation error in a manner that is uniform in time and does not increase with the number of particles.
arXiv Detail & Related papers (2023-03-23T16:50:08Z)
- Monte Carlo Neural PDE Solver for Learning PDEs via Probabilistic Representation [59.45669299295436]
We propose a Monte Carlo PDE solver for training unsupervised neural solvers.
We use the PDEs' probabilistic representation, which regards macroscopic phenomena as ensembles of random particles.
Our experiments on convection-diffusion, Allen-Cahn, and Navier-Stokes equations demonstrate significant improvements in accuracy and efficiency.
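The probabilistic representation such solvers rely on can be illustrated in its simplest form: for the heat equation u_t = 0.5 * u_xx, the Feynman-Kac formula gives u(t, x) = E[u0(x + W_t)], which a plain Monte Carlo average estimates directly. This toy sketch omits the neural solver entirely and is not the paper's method.

```python
import numpy as np

def mc_heat_solution(u0, x, t, n_samples=200_000, seed=0):
    """Monte Carlo estimate of u(t, x) for the heat equation u_t = 0.5 * u_xx,
    via the probabilistic representation u(t, x) = E[u0(x + W_t)], W_t ~ N(0, t)."""
    rng = np.random.default_rng(seed)
    w = rng.normal(scale=np.sqrt(t), size=n_samples)   # Brownian increments
    return u0(x + w).mean()

# u0(x) = cos(x) has the exact solution u(t, x) = exp(-t/2) * cos(x)
est = mc_heat_solution(np.cos, x=0.5, t=1.0)
exact = np.exp(-0.5) * np.cos(0.5)
print(est, exact)
```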
arXiv Detail & Related papers (2023-02-10T08:05:19Z) - Sampling with Mollified Interaction Energy Descent [57.00583139477843]
We present a new optimization-based method for sampling called mollified interaction energy descent (MIED).
MIED minimizes a new class of energies on probability measures called mollified interaction energies (MIEs).
We show experimentally that for unconstrained sampling problems our algorithm performs on par with existing particle-based algorithms like SVGD.
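For reference, the SVGD baseline mentioned above has a particularly compact update. The sketch below is the standard RBF-kernel SVGD step (Liu & Wang, 2016), not MIED itself, and the bandwidth and step size are illustrative choices.

```python
import numpy as np

def svgd_step(X, grad_logp, step=0.05, h=0.5):
    """One SVGD update with an RBF kernel:
        x_i <- x_i + (step / n) * sum_j [ k(x_j, x_i) grad log p(x_j)
                                          + grad_{x_j} k(x_j, x_i) ]."""
    n = X.shape[0]
    diff = X[:, None, :] - X[None, :, :]                  # x_i - x_j
    k = np.exp(-np.sum(diff**2, axis=-1) / (2 * h**2))    # RBF kernel (symmetric)
    drive = k @ grad_logp(X)                              # pull toward high density
    repulse = (diff * k[..., None]).sum(axis=1) / h**2    # particles push apart
    return X + step * (drive + repulse) / n

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2)) + 3.0         # initial particles far from the target
for _ in range(1000):
    X = svgd_step(X, lambda x: -x)         # standard-normal target
print(X.mean(axis=0), X.var())
```

The kernel-weighted score term drives particles toward the target mode while the kernel-gradient term keeps them from collapsing onto one another.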
arXiv Detail & Related papers (2022-10-24T16:54:18Z) - Probability flow solution of the Fokker-Planck equation [10.484851004093919]
We introduce an alternative scheme based on integrating an ordinary differential equation that describes the flow of probability.
Unlike the dynamics, this equation deterministically pushes samples from the initial density onto samples from the solution at any later time.
Our approach is based on recent advances in score-based diffusion for generative modeling.
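The probability-flow idea can be seen in closed form for an Ornstein-Uhlenbeck process, where the Fokker-Planck solution stays Gaussian and the score is known exactly, so the ODE can be integrated directly. This toy sketch uses the analytic score rather than a learned one; the parameters are illustrative.

```python
import numpy as np

def probability_flow(x0, t_end=2.0, dt=0.001, sigma0_sq=4.0):
    """Probability-flow ODE for the OU process dX = -X dt + sqrt(2) dW,
    started from N(0, sigma0_sq).  The density stays Gaussian,
    rho_t = N(0, s(t)) with s(t) = 1 + (sigma0_sq - 1) * exp(-2 t),
    so the score is -x / s(t) and the flow reads
        dx/dt = -x - score(x) = x * (1 / s(t) - 1)."""
    x = x0.copy()
    t = 0.0
    while t < t_end:
        s = 1.0 + (sigma0_sq - 1.0) * np.exp(-2.0 * t)
        x += dt * x * (1.0 / s - 1.0)        # deterministic transport of samples
        t += dt
    return x

rng = np.random.default_rng(0)
x0 = rng.normal(scale=2.0, size=100_000)     # samples from N(0, 4)
xT = probability_flow(x0)
print(xT.var())                              # analytic variance: 1 + 3 e^{-4} ~ 1.05
```

Each initial sample is pushed deterministically onto a sample of the later density, with no noise injected along the way.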
arXiv Detail & Related papers (2022-06-09T17:37:09Z)
- Compressive Fourier collocation methods for high-dimensional diffusion equations with periodic boundary conditions [7.80387197350208]
High-dimensional Partial Differential Equations (PDEs) are a popular mathematical modelling tool, with applications ranging from finance to computational chemistry.
Standard numerical techniques for solving these PDEs are typically affected by the curse of dimensionality.
Inspired by recent progress in sparse function approximation in high dimensions, we propose a new method called compressive Fourier collocation.
arXiv Detail & Related papers (2022-06-02T19:11:27Z)
- Large-Scale Wasserstein Gradient Flows [84.73670288608025]
We introduce a scalable scheme to approximate Wasserstein gradient flows.
Our approach relies on input convex neural networks (ICNNs) to discretize the JKO steps.
As a result, we can sample from the measure at each step of the gradient diffusion and compute its density.
arXiv Detail & Related papers (2021-06-01T19:21:48Z)
- Leveraging Global Parameters for Flow-based Neural Posterior Estimation [90.21090932619695]
Inferring the parameters of a model based on experimental observations is central to the scientific method.
A particularly challenging setting is when the model is strongly indeterminate, i.e., when distinct sets of parameters yield identical observations.
We present a method for cracking such indeterminacy by exploiting additional information conveyed by an auxiliary set of observations sharing global parameters.
arXiv Detail & Related papers (2021-02-12T12:23:13Z)
- Variational Transport: A Convergent Particle-Based Algorithm for Distributional Optimization [106.70006655990176]
A distributional optimization problem arises widely in machine learning and statistics.
We propose a novel particle-based algorithm, dubbed as variational transport, which approximately performs Wasserstein gradient descent.
We prove that when the objective function satisfies a functional version of the Polyak-Lojasiewicz (PL) condition (Polyak, 1963) together with smoothness conditions, variational transport converges linearly.
arXiv Detail & Related papers (2020-12-21T18:33:13Z)
- SLEIPNIR: Deterministic and Provably Accurate Feature Expansion for Gaussian Process Regression with Derivatives [86.01677297601624]
We propose a novel approach for scaling GP regression with derivatives based on quadrature Fourier features.
We prove deterministic, non-asymptotic and exponentially fast decaying error bounds which apply for both the approximated kernel as well as the approximated posterior.
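A Fourier-feature kernel approximation can be sketched in a few lines. The code below uses the classical random Fourier features of Rahimi & Recht (2007) rather than the paper's deterministic quadrature nodes, so it is a simplified stand-in: inner products of the features approximate an RBF kernel.

```python
import numpy as np

def rff_features(X, n_features=5000, lengthscale=1.0, seed=0):
    """Random Fourier features whose inner products approximate the RBF kernel
    k(x, y) = exp(-|x - y|^2 / (2 * lengthscale^2)).  The paper uses
    deterministic quadrature frequencies; this sketch keeps the random variant."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W = rng.normal(scale=1.0 / lengthscale, size=(d, n_features))  # frequencies
    b = rng.uniform(0, 2 * np.pi, size=n_features)                 # phases
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

X = np.array([[0.0], [0.5], [2.0]])
Phi = rff_features(X)
approx = Phi @ Phi.T                      # feature-space Gram matrix
exact = np.exp(-(X - X.T)**2 / 2.0)       # exact RBF Gram matrix
print(np.abs(approx - exact).max())
```

Replacing the random frequencies with quadrature nodes is what makes the error bounds deterministic rather than probabilistic.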
arXiv Detail & Related papers (2020-03-05T14:33:20Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.