On Sampling with Approximate Transport Maps
- URL: http://arxiv.org/abs/2302.04763v3
- Date: Sun, 18 Feb 2024 17:56:53 GMT
- Title: On Sampling with Approximate Transport Maps
- Authors: Louis Grenioux, Alain Durmus, Éric Moulines, Marylou Gabrié
- Score: 22.03230737620495
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Transport maps can ease the sampling of distributions with non-trivial geometries by transforming them into distributions that are easier to handle. The potential of this approach has risen with the development of Normalizing Flows (NF), which are maps parameterized with deep neural networks trained to push a reference distribution towards a target. Recently proposed NF-enhanced samplers blend (Markov chain) Monte Carlo methods with either (i) proposal draws from the flow or (ii) a flow-based reparametrization. In both cases, the quality of the learned transport determines performance. The present work clarifies for the first time the relative strengths and weaknesses of these two approaches. Our study concludes that multimodal targets can be reliably handled with flow-based proposals up to moderately high dimensions. In contrast, methods relying on reparametrization struggle with multimodality but are otherwise more robust in high-dimensional settings and under poor training. To further illustrate the influence of target-proposal adequacy, we also derive a new quantitative bound for the mixing time of the Independent Metropolis-Hastings sampler.
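To make approach (i) concrete, below is a minimal sketch of an Independent Metropolis-Hastings (IMH) sampler driven by flow proposals. The `GaussianFlowStandIn` class and the bimodal `log_target` are hypothetical stand-ins introduced here for illustration; a trained normalizing flow would expose the same `sample`/`log_prob` interface. Note that the target enters the acceptance ratio only through the importance weight w(x) = p(x)/q(x), the natural measure of target-proposal adequacy.

```python
import numpy as np

rng = np.random.default_rng(0)

class GaussianFlowStandIn:
    """Stand-in for a trained flow: only sample() and log_prob() are needed."""
    def sample(self, n):
        return rng.standard_normal(n)
    def log_prob(self, x):
        return -0.5 * x ** 2 - 0.5 * np.log(2 * np.pi)

def log_target(x):
    # Unnormalized bimodal target: mixture of N(-2, 1) and N(2, 1)
    return np.logaddexp(-0.5 * (x - 2) ** 2, -0.5 * (x + 2) ** 2)

def imh(flow, log_target, n_steps=5_000):
    x = flow.sample(1)[0]
    chain = np.empty(n_steps)
    for i in range(n_steps):
        y = flow.sample(1)[0]
        # The acceptance ratio depends on the target only through the
        # importance weight w(x) = p(x) / q(x): log alpha = log w(y) - log w(x)
        log_alpha = (log_target(y) - flow.log_prob(y)) \
                  - (log_target(x) - flow.log_prob(x))
        if np.log(rng.uniform()) < log_alpha:
            x = y
        chain[i] = x
    return chain

chain = imh(GaussianFlowStandIn(), log_target)
```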
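For contrast, here is a sketch of approach (ii), flow-based reparametrization: the Markov chain runs in the flow's latent space on the pulled-back density log p(T(z)) + log|det J_T(z)| and samples are mapped back through T. It reuses `rng` and `log_target` from the sketch above; the affine map T is a stand-in for a trained flow and its log-determinant.

```python
mu, sigma = 0.0, 2.0  # stand-in affine "flow" T(z) = mu + sigma * z

def T(z):
    return mu + sigma * z

def log_pullback(z):
    # log p(T(z)) + log|det J_T(z)|; for an affine map the log-det is log(sigma)
    return log_target(T(z)) + np.log(sigma)

def random_walk_mh(log_pi, n_steps=5_000, step=0.5):
    z, chain = 0.0, np.empty(n_steps)
    for i in range(n_steps):
        prop = z + step * rng.standard_normal()
        if np.log(rng.uniform()) < log_pi(prop) - log_pi(z):
            z = prop
        chain[i] = z
    return chain

samples = T(random_walk_mh(log_pullback))  # map the latent chain back to x-space
```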
Related papers
- Enhanced Importance Sampling through Latent Space Exploration in Normalizing Flows [69.8873421870522]
Importance sampling is a rare-event simulation technique used in Monte Carlo simulations.
We propose a method for more efficient sampling by updating the proposal distribution in the latent space of a normalizing flow.
arXiv Detail & Related papers (2025-01-06T21:18:02Z)
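A hedged sketch of the basic mechanism this entry builds on: use the flow as an importance-sampling proposal and reweight draws by p/q (the entry's latent-space updating of the proposal is not reproduced here). The function below follows the same illustrative `flow`/`log_target` interface as the IMH sketch above; for example, `self_normalized_is(GaussianFlowStandIn(), log_target, np.abs)` would estimate E_p|X|.

```python
import numpy as np

def self_normalized_is(flow, log_target, f, n=10_000):
    """Estimate E_p[f(X)] using the flow as proposal (illustrative interface)."""
    x = flow.sample(n)
    log_w = log_target(x) - flow.log_prob(x)   # unnormalized importance weights
    w = np.exp(log_w - log_w.max())
    w /= w.sum()                               # self-normalization
    return np.sum(w * f(x))
```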
- Sequential Controlled Langevin Diffusions [80.93988625183485]
Two popular methods are (1) Sequential Monte Carlo (SMC), where the transport is performed through successive densities via prescribed Markov chains and resampling steps, and (2) recently developed diffusion-based sampling methods, where a learned dynamical transport is used.
We present a principled framework for combining SMC with diffusion-based samplers by viewing both methods in continuous time and considering measures on path space.
This culminates in the new Sequential Controlled Langevin Diffusion (SCLD) sampling method, which is able to utilize the benefits of both methods and reaches improved performance on multiple benchmark problems, in many cases using only 10% of the training budget of previous diffusion-based samplers.
arXiv Detail & Related papers (2024-12-10T00:47:10Z)
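Since the entry leans on SMC, a minimal annealed-SMC sketch may help: particles are reweighted along a sequence of tempered densities, resampled, and moved by a Metropolis kernel. The linear tempering schedule, random-walk kernel, and 1-D bimodal target are illustrative choices, not the SCLD algorithm itself.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_p0(x):   # reference density: standard normal (unnormalized)
    return -0.5 * x ** 2

def log_p(x):    # target: equal mixture of N(-3, 1) and N(3, 1) (unnormalized)
    return np.logaddexp(-0.5 * (x - 3) ** 2, -0.5 * (x + 3) ** 2)

n, betas = 1_000, np.linspace(0.0, 1.0, 11)
x = rng.standard_normal(n)                      # particles start at the reference
for b_prev, b in zip(betas[:-1], betas[1:]):
    log_pi = lambda y, b=b: (1 - b) * log_p0(y) + b * log_p(y)
    # Reweight by the ratio of successive tempered densities, then resample
    log_w = (b - b_prev) * (log_p(x) - log_p0(x))
    w = np.exp(log_w - log_w.max()); w /= w.sum()
    x = x[rng.choice(n, size=n, p=w)]
    # One random-walk Metropolis move per particle at inverse temperature b
    prop = x + 0.5 * rng.standard_normal(n)
    accept = np.log(rng.uniform(size=n)) < log_pi(prop) - log_pi(x)
    x = np.where(accept, prop, x)
```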
- Learned Reference-based Diffusion Sampling for multi-modal distributions [2.1383136715042417]
We introduce Learned Reference-based Diffusion Sampler (LRDS), a methodology specifically designed to leverage prior knowledge on the location of the target modes.
LRDS proceeds in two steps, the first of which learns a reference diffusion model on samples located in high-density regions of the target space.
We experimentally demonstrate that LRDS best exploits prior knowledge on the target distribution compared to competing algorithms on a variety of challenging distributions.
arXiv Detail & Related papers (2024-10-25T10:23:34Z)
- Dynamical Measure Transport and Neural PDE Solvers for Sampling [77.38204731939273]
We cast the task of sampling from a probability density as transporting a tractable density to the target.
We employ physics-informed neural networks (PINNs) to approximate the respective partial differential equations (PDEs) solutions.
PINNs allow for simulation- and discretization-free optimization and can be trained very efficiently.
arXiv Detail & Related papers (2024-07-10T17:39:50Z)
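To illustrate the "simulation- and discretization-free" point, here is a generic PINN sketch: a network u(t, x) is fitted by penalizing the residual of a PDE at randomly sampled collocation points, here the 1-D heat equation (the Fokker-Planck equation of Brownian motion) rather than the paper's actual transport PDEs; boundary terms are omitted for brevity.

```python
import torch

torch.manual_seed(0)

# u(t, x) approximated by a small MLP; inputs are concatenated (t, x)
net = torch.nn.Sequential(
    torch.nn.Linear(2, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

def pde_residual(t, x):
    # Residual of du/dt = 0.5 * d2u/dx2 at the collocation points (t, x)
    u = net(torch.cat([t, x], dim=-1))
    u_t, u_x = torch.autograd.grad(u.sum(), (t, x), create_graph=True)
    u_xx = torch.autograd.grad(u_x.sum(), x, create_graph=True)[0]
    return u_t - 0.5 * u_xx

for step in range(2_000):
    t = torch.rand(256, 1, requires_grad=True)          # t in [0, 1]
    x = (4 * torch.rand(256, 1) - 2).requires_grad_()   # x in [-2, 2]
    x0 = 4 * torch.rand(256, 1) - 2
    u0 = torch.exp(-0.5 * x0 ** 2) / (2 * torch.pi) ** 0.5  # u(0, x) = N(0,1) pdf
    ic = net(torch.cat([torch.zeros_like(x0), x0], dim=-1)) - u0
    loss = pde_residual(t, x).pow(2).mean() + ic.pow(2).mean()
    opt.zero_grad(); loss.backward(); opt.step()
```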
- Improved off-policy training of diffusion samplers [93.66433483772055]
We study the problem of training diffusion models to sample from a distribution with an unnormalized density or energy function.
We benchmark several diffusion-structured inference methods, including simulation-based variational approaches and off-policy methods.
Our results shed light on the relative advantages of existing algorithms while bringing into question some claims from past work.
arXiv Detail & Related papers (2024-02-07T18:51:49Z)
- Improving Transferability of Adversarial Examples via Bayesian Attacks [84.90830931076901]
We introduce a novel extension by incorporating the Bayesian formulation into the model input as well, enabling the joint diversification of both the model input and model parameters.
Our method achieves a new state-of-the-art on transfer-based attacks, improving the average success rate on ImageNet and CIFAR-10 by 19.14% and 2.08%, respectively.
arXiv Detail & Related papers (2023-07-21T03:43:07Z)
- Efficient Multimodal Sampling via Tempered Distribution Flow [11.36635610546803]
We develop a new type of transport-based sampling method called TemperFlow.
Various experiments demonstrate the superior performance of this novel sampler compared to traditional methods.
We show its applications in modern deep learning tasks such as image generation.
arXiv Detail & Related papers (2023-04-08T06:40:06Z)
- Transport Reversible Jump Proposals [0.8399688944263843]
We present an approach to enhance the efficiency of reversible jump Markov chain Monte Carlo (RJMCMC) sampling by performing transdimensional jumps involving reference distributions.
It is shown that, in the setting where exact transports are used, our RJMCMC proposals have the desirable property that the acceptance probability depends only on the model probabilities.
arXiv Detail & Related papers (2022-10-22T23:48:04Z)
- Learning Optimal Transport Between two Empirical Distributions with Normalizing Flows [12.91637880428221]
We propose to leverage the flexibility of neural networks to learn an approximate optimal transport map.
We show that a particular instance of invertible neural networks, namely the normalizing flows, can be used to approximate the solution of this OT problem.
arXiv Detail & Related papers (2022-07-04T08:08:47Z)
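A hedged sketch of the idea in this entry: an invertible coupling layer is trained to push one set of empirical samples onto another while penalizing the squared displacement, the cost of quadratic OT. The MMD matching term and the toy 2-D Gaussians are my illustrative choices, not the paper's actual objective; a single coupling layer leaves the first coordinate unchanged, so the toy target differs from the source only in the second.

```python
import torch

torch.manual_seed(0)

class AffineCoupling(torch.nn.Module):
    """One invertible coupling layer for 2-D inputs."""
    def __init__(self, hidden=64):
        super().__init__()
        self.net = torch.nn.Sequential(
            torch.nn.Linear(1, hidden), torch.nn.Tanh(),
            torch.nn.Linear(hidden, 2))
    def forward(self, x):
        x1, x2 = x.chunk(2, dim=-1)           # split the 2-D input
        s, t = self.net(x1).chunk(2, dim=-1)  # scale/shift conditioned on x1
        return torch.cat([x1, x2 * torch.exp(s) + t], dim=-1)

def mmd(a, b, sigma=1.0):                     # Gaussian-kernel MMD
    k = lambda u, v: torch.exp(-torch.cdist(u, v) ** 2 / (2 * sigma ** 2))
    return k(a, a).mean() + k(b, b).mean() - 2 * k(a, b).mean()

flow = AffineCoupling()
opt = torch.optim.Adam(flow.parameters(), lr=1e-3)
source = torch.randn(512, 2)                               # empirical source
target = torch.randn(512, 2) + torch.tensor([0.0, 3.0])    # empirical target

for step in range(1_000):
    y = flow(source)
    # distribution matching + quadratic transport cost E||T(x) - x||^2
    loss = mmd(y, target) + 0.1 * ((y - source) ** 2).sum(-1).mean()
    opt.zero_grad(); loss.backward(); opt.step()
```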
- Distributionally Robust Models with Parametric Likelihood Ratios [123.05074253513935]
Three simple ideas allow us to train models with distributionally robust optimization (DRO) using a broader class of parametric likelihood ratios.
We find that models trained with the resulting parametric adversaries are consistently more robust to subpopulation shifts when compared to other DRO approaches.
arXiv Detail & Related papers (2022-04-13T12:43:12Z)
- Robust model training and generalisation with Studentising flows [22.757298187704745]
We discuss how flow-based models can be further improved based on insights from robust (in particular, resistant) statistics.
We propose to endow flow-based models with fat-tailed latent distributions as a simple drop-in replacement for the Gaussian distribution.
Experiments on several different datasets confirm the efficacy of the proposed approach.
arXiv Detail & Related papers (2020-06-11T16:47:01Z)
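The drop-in replacement this entry describes can be illustrated in a few lines: swap the flow's Gaussian base for a Student's t with low degrees of freedom, keeping the same sample/log_prob interface (a generic sketch using `torch.distributions`, not the paper's code).

```python
import torch

# Gaussian base vs. a fat-tailed Student's t base for a flow; both expose
# the sample()/log_prob() interface a flow's base distribution needs.
base_gaussian = torch.distributions.Normal(0.0, 1.0)
base_fat_tailed = torch.distributions.StudentT(df=3.0)  # heavier tails

z = base_fat_tailed.sample((8,))
# Tail outliers incur a much smaller log-density penalty under the t base
print(base_fat_tailed.log_prob(z) - base_gaussian.log_prob(z))
```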