Efficient Multimodal Sampling via Tempered Distribution Flow
- URL: http://arxiv.org/abs/2304.03933v1
- Date: Sat, 8 Apr 2023 06:40:06 GMT
- Title: Efficient Multimodal Sampling via Tempered Distribution Flow
- Authors: Yixuan Qiu, Xiao Wang
- Abstract summary: We develop a new type of transport-based sampling method called TemperFlow.
Various experiments demonstrate the superior performance of this novel sampler compared to traditional methods.
We show its applications in modern deep learning tasks such as image generation.
- Score: 11.36635610546803
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Sampling from high-dimensional distributions is a fundamental problem in
statistical research and practice. However, great challenges emerge when the
target density function is unnormalized and contains isolated modes. We tackle
this difficulty by fitting an invertible transformation mapping, called a
transport map, between a reference probability measure and the target
distribution, so that sampling from the target distribution can be achieved by
pushing forward a reference sample through the transport map. We theoretically
analyze the limitations of existing transport-based sampling methods using the
Wasserstein gradient flow theory, and propose a new method called TemperFlow
that addresses the multimodality issue. TemperFlow adaptively learns a sequence
of tempered distributions to progressively approach the target distribution,
and we prove that it overcomes the limitations of existing methods. Various
experiments demonstrate the superior performance of this novel sampler compared
to traditional methods, and we show its applications in modern deep learning
tasks such as image generation. The programming code for the numerical
experiments is available at https://github.com/yixuan/temperflow.
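The tempering mechanism is easy to illustrate in code. The sketch below is a minimal toy, not the authors' implementation: a small planar flow is fitted to a sequence of tempered targets pi_beta(x) proportional to pi(x)^beta by minimizing the reverse KL divergence of its pushforward distribution, warm-starting the flow at each temperature. The bimodal target, flow architecture, and beta schedule are all illustrative choices.

```python
# Toy sketch of tempered transport sampling in the spirit of TemperFlow.
import torch

torch.manual_seed(0)

def log_target(x):
    """Unnormalized bimodal target: mixture of N(-4, 0.5^2) and N(4, 0.5^2)."""
    return torch.logsumexp(torch.stack([
        -0.5 * ((x + 4.0) / 0.5) ** 2,
        -0.5 * ((x - 4.0) / 0.5) ** 2,
    ]), dim=0)

class PlanarFlow1D(torch.nn.Module):
    """A stack of 1-D planar maps x = z + u * tanh(w * z + b)."""
    def __init__(self, depth=8):
        super().__init__()
        self.u = torch.nn.Parameter(0.1 * torch.randn(depth))
        self.w = torch.nn.Parameter(0.1 * torch.randn(depth))
        self.b = torch.nn.Parameter(torch.zeros(depth))

    def forward(self, z):
        logdet = torch.zeros_like(z)
        for u, w, b in zip(self.u, self.w, self.b):
            h = torch.tanh(w * z + b)
            z = z + u * h
            logdet = logdet + torch.log(torch.abs(1 + u * w * (1 - h ** 2)) + 1e-9)
        return z, logdet

flow = PlanarFlow1D()
opt = torch.optim.Adam(flow.parameters(), lr=1e-2)

# Tempering ladder: beta near 0 flattens the target, beta = 1 recovers it.
for beta in [0.05, 0.1, 0.25, 0.5, 1.0]:
    for _ in range(500):
        z = torch.randn(256)
        x, logdet = flow(z)
        # Reverse KL between the pushforward density and the tempered target
        # pi_beta(x) proportional to pi(x)^beta, up to an additive constant.
        loss = (-0.5 * z ** 2 - logdet - beta * log_target(x)).mean()
        opt.zero_grad(); loss.backward(); opt.step()

with torch.no_grad():
    samples, _ = flow(torch.randn(10000))
print("sample mass left/right of 0:",
      (samples < 0).float().mean().item(), (samples > 0).float().mean().item())
```

Driving beta from near 0 to 1 flattens the target early on, so the flow can spread mass across both modes before the barrier between them sharpens; fitting beta = 1 directly with reverse KL tends to collapse onto a single mode, which is the failure mode the tempering sequence is designed to avoid.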
Related papers
- Stochastic Sampling from Deterministic Flow Models [8.849981177332594]
We present a method to turn flow models into a family of stochastic differential equations (SDEs) that share the same marginal distributions.
We empirically demonstrate the advantages of our method on a toy Gaussian setup and on the large-scale ImageNet generation task.
arXiv Detail & Related papers (2024-10-03T05:18:28Z)
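A toy check of the conversion described in the entry above, under stated assumptions rather than the paper's construction: for a probability-flow ODE dx = v(x, t) dt with marginals p_t, the SDE dx = [v + (eps/2) * grad log p_t] dt + sqrt(eps) dW transports the same marginals for any noise level eps >= 0. The Gaussian path below keeps both the velocity and the score analytic so the claim can be verified numerically.

```python
# ODE-to-SDE conversion with identical marginals, on an analytic Gaussian path.
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 4.0, 0.5            # path from N(0, 1) at t=0 to N(mu, sigma^2) at t=1

def m(t):  return t * mu        # mean of p_t
def s(t):  return (1 - t) + t * sigma   # std of p_t

def velocity(x, t):             # ODE drift matching the Gaussian marginal path
    return mu + (sigma - 1) * (x - m(t)) / s(t)

def score(x, t):                # grad log p_t(x) for the Gaussian path
    return -(x - m(t)) / s(t) ** 2

n, steps, eps = 100000, 1000, 0.8
dt = 1.0 / steps
x_ode = rng.standard_normal(n)
x_sde = rng.standard_normal(n)
for k in range(steps):
    t = k * dt
    x_ode += velocity(x_ode, t) * dt
    drift = velocity(x_sde, t) + 0.5 * eps * score(x_sde, t)
    x_sde += drift * dt + np.sqrt(eps * dt) * rng.standard_normal(n)

print("ODE terminal mean/std:", x_ode.mean(), x_ode.std())  # both close to (4.0, 0.5)
print("SDE terminal mean/std:", x_sde.mean(), x_sde.std())
```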
- Annealing Flow Generative Model Towards Sampling High-Dimensional and Multi-Modal Distributions [6.992239210938067]
Annealing Flow (AF) is a continuous normalizing flow-based approach designed to sample from high-dimensional and multimodal distributions.
AF ensures effective and balanced mode exploration, achieves linear complexity in sample size and dimension, and circumvents the slow mixing that hampers conventional samplers.
arXiv Detail & Related papers (2024-09-30T17:48:22Z)
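As a sketch of the annealing path such samplers traverse, the snippet below moves particles along pi_beta(x) proportional to pi_0(x)^(1-beta) * pi_1(x)^beta. Annealing Flow itself learns a normalizing-flow block per level; here a few Langevin steps per level serve as a simple stand-in transport, and the 2-D bimodal target and schedule are toy choices.

```python
# Particles carried along a geometric annealing path between a Gaussian base
# pi_0 and a bimodal target pi_1; Langevin moves stand in for learned flows.
import numpy as np

rng = np.random.default_rng(1)
modes = np.array([[-5.0, 0.0], [5.0, 0.0]])

def grad_log_pi1(x):            # bimodal target: equal mixture of unit Gaussians
    d = x[:, None, :] - modes[None, :, :]          # (n, 2 modes, 2 dims)
    logw = -0.5 * (d ** 2).sum(-1)                 # per-mode log responsibilities
    w = np.exp(logw - logw.max(1, keepdims=True))
    w /= w.sum(1, keepdims=True)
    return -(w[..., None] * d).sum(1)              # responsibility-weighted pull

def grad_log_pi0(x):            # base: standard Gaussian
    return -x

n, step = 2000, 0.05
x = rng.standard_normal((n, 2))
for beta in np.linspace(0.0, 1.0, 21):
    for _ in range(50):         # short inner transport at annealing level beta
        g = (1 - beta) * grad_log_pi0(x) + beta * grad_log_pi1(x)
        x += step * g + np.sqrt(2 * step) * rng.standard_normal(x.shape)

print("mass in left/right mode:", (x[:, 0] < 0).mean(), (x[:, 0] >= 0).mean())
```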
- Dynamical Measure Transport and Neural PDE Solvers for Sampling [77.38204731939273]
We cast the task of sampling from a probability density as transporting a tractable reference density to the target.
We employ physics-informed neural networks (PINNs) to approximate the solutions of the corresponding partial differential equations (PDEs).
PINNs allow for simulation- and discretization-free optimization and can be trained very efficiently.
arXiv Detail & Related papers (2024-07-10T17:39:50Z)
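A minimal PINN sketch in this spirit, on a toy 1-D setup rather than the paper's solver: given a known Gaussian density path p_t, a small network learns the transport velocity v(x, t) by minimizing the squared residual of the continuity equation, d/dt log p_t + v * d/dx log p_t + dv/dx = 0, at random collocation points.

```python
# PINN for the 1-D continuity equation along a Gaussian density path.
import torch

torch.manual_seed(0)
mu, sigma = 3.0, 0.5   # path from N(0, 1) at t=0 to N(mu, sigma^2) at t=1

def m(t):
    return t * mu

def s(t):
    return (1 - t) + t * sigma

def dlogp_dt(x, t):
    # analytic d/dt log p_t(x) for p_t = N(m(t), s(t)^2), with s'(t) = sigma - 1
    sp = sigma - 1.0
    return (-sp / s(t) + (x - m(t)) * mu / s(t) ** 2
            + (x - m(t)) ** 2 * sp / s(t) ** 3)

def dlogp_dx(x, t):
    return -(x - m(t)) / s(t) ** 2

net = torch.nn.Sequential(
    torch.nn.Linear(2, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

for it in range(3000):
    t = torch.rand(512, 1)
    x = (torch.randn(512, 1) * s(t) + m(t)).requires_grad_(True)  # collocation pts
    v = net(torch.cat([x, t], 1))
    dv_dx = torch.autograd.grad(v.sum(), x, create_graph=True)[0]
    residual = dlogp_dt(x, t) + v * dlogp_dx(x, t) + dv_dx  # continuity equation
    loss = (residual ** 2).mean()
    opt.zero_grad(); loss.backward(); opt.step()

# exact velocity for this path is v*(x, t) = mu + (sigma - 1) * (x - m(t)) / s(t)
with torch.no_grad():
    xt = torch.linspace(-1.0, 4.0, 5).reshape(-1, 1)
    t5 = torch.full_like(xt, 0.5)
    print(net(torch.cat([xt, t5], 1)).squeeze())
    print((mu + (sigma - 1) * (xt - m(t5)) / s(t5)).squeeze())
```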
- Improved off-policy training of diffusion samplers [93.66433483772055]
We study the problem of training diffusion models to sample from a distribution with an unnormalized density or energy function.
We benchmark several diffusion-structured inference methods, including simulation-based variational approaches and off-policy methods.
Our results shed light on the relative advantages of existing algorithms while bringing into question some claims from past work.
arXiv Detail & Related papers (2024-02-07T18:51:49Z)
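One baseline family such benchmarks cover is the simulation-based variational approach. The hedged sketch below trains a drift network by minimizing a discrete path-space reverse KL against a reference measure whose terminal state is pinned to the target (with Brownian backward kernels), which reduces to the Girsanov-style accumulator in the code; all architecture and schedule choices are illustrative. Its mode-seeking behavior on a bimodal target is exactly the weakness that motivates off-policy training.

```python
# Reverse-KL (PIS/DDS-style) training of a diffusion sampler on a toy target.
import torch

torch.manual_seed(0)

def log_target(x):
    # unnormalized bimodal target: equal mixture of N(-3, 1) and N(3, 1)
    return torch.logsumexp(torch.stack([
        -0.5 * (x + 3.0) ** 2, -0.5 * (x - 3.0) ** 2]), 0)

drift = torch.nn.Sequential(
    torch.nn.Linear(2, 64), torch.nn.SiLU(), torch.nn.Linear(64, 1))
opt = torch.optim.Adam(drift.parameters(), lr=1e-3)
steps, dt, batch = 50, 0.02, 256

for it in range(2000):
    x = torch.randn(batch, 1)   # x_0 ~ N(0, 1); its log-density is theta-free
    kl = torch.zeros(batch, 1)  # accumulates the Girsanov KL terms
    for k in range(steps):
        tk = torch.full_like(x, k * dt)
        f = drift(torch.cat([x, tk], 1))
        noise = torch.randn_like(x)
        kl = kl + 0.5 * f ** 2 * dt + f * noise * dt ** 0.5
        x = x + f * dt + noise * dt ** 0.5
    loss = (kl - log_target(x)).mean()  # reverse KL is mode-seeking
    opt.zero_grad(); loss.backward(); opt.step()

with torch.no_grad():               # where does the learned sampler put its mass?
    x = torch.randn(4000, 1)
    for k in range(steps):
        tk = torch.full_like(x, k * dt)
        x = x + drift(torch.cat([x, tk], 1)) * dt + torch.randn_like(x) * dt ** 0.5
print("sample mass left/right of 0:",
      (x < 0).float().mean().item(), (x > 0).float().mean().item())
```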
- Distributed Markov Chain Monte Carlo Sampling based on the Alternating Direction Method of Multipliers [143.6249073384419]
In this paper, we propose a distributed sampling scheme based on the alternating direction method of multipliers.
We provide both theoretical guarantees of our algorithm's convergence and experimental evidence of its superiority to the state-of-the-art.
In simulation, we deploy our algorithm on linear and logistic regression tasks and illustrate its fast convergence compared to existing gradient-based methods.
arXiv Detail & Related papers (2024-01-29T02:08:40Z)
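For background, here is the consensus-ADMM splitting that such a distributed scheme builds on, in its plain optimization form on a distributed least-squares problem; the paper's contribution, which layers sampling on top of these updates, is not reproduced here. Each worker holds a private data shard (A_i, b_i) and a local copy x_i, coordinated through a global consensus variable z.

```python
# Consensus ADMM for distributed least squares (background, not the sampler).
import numpy as np

rng = np.random.default_rng(0)
d, workers = 5, 4
x_true = rng.standard_normal(d)
A = [rng.standard_normal((50, d)) for _ in range(workers)]
b = [Ai @ x_true + 0.1 * rng.standard_normal(50) for Ai in A]

rho = 1.0
x = [np.zeros(d) for _ in range(workers)]
u = [np.zeros(d) for _ in range(workers)]
z = np.zeros(d)

for it in range(100):
    # local updates: argmin 0.5*||A_i x - b_i||^2 + (rho/2)*||x - z + u_i||^2
    x = [np.linalg.solve(Ai.T @ Ai + rho * np.eye(d),
                         Ai.T @ bi + rho * (z - ui))
         for Ai, bi, ui in zip(A, b, u)]
    z = np.mean([xi + ui for xi, ui in zip(x, u)], axis=0)  # consensus average
    u = [ui + xi - z for xi, ui in zip(x, u)]               # dual ascent

# consensus solution matches the centralized least-squares solution
A_all, b_all = np.vstack(A), np.concatenate(b)
print(np.allclose(z, np.linalg.lstsq(A_all, b_all, rcond=None)[0], atol=1e-3))
```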
- Diffusion Generative Flow Samplers: Improving learning signals through partial trajectory optimization [87.21285093582446]
Diffusion Generative Flow Samplers (DGFS) is a sampling-based framework where the learning process can be tractably broken down into short partial trajectory segments.
Our method takes inspiration from the theory developed for generative flow networks (GFlowNets).
arXiv Detail & Related papers (2023-10-04T09:39:05Z)
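A hedged sketch of the partial-trajectory learning signal: in sub-trajectory-balance-style training (the GFlowNet machinery DGFS draws on), every segment (m, n) of a trajectory contributes its own squared residual, log F(s_m) + sum of forward step log-probabilities versus log F(s_n) + sum of backward step log-probabilities, so gradients do not have to wait for complete trajectories. Shapes and names below are illustrative, not the paper's API.

```python
# Sub-trajectory balance loss over all segments of one sampled trajectory.
import torch

def subtb_loss(log_flow, log_pf, log_pb):
    """log_flow: (T+1,) learned log state flows along the trajectory;
    log_pf, log_pb: (T,) forward/backward step log-probabilities."""
    T = log_pf.shape[0]
    cum_f = torch.cat([torch.zeros(1), torch.cumsum(log_pf, 0)])  # prefix sums
    cum_b = torch.cat([torch.zeros(1), torch.cumsum(log_pb, 0)])
    loss, count = 0.0, 0
    for m in range(T + 1):
        for n in range(m + 1, T + 1):            # every partial segment (m, n)
            lhs = log_flow[m] + (cum_f[n] - cum_f[m])
            rhs = log_flow[n] + (cum_b[n] - cum_b[m])
            loss = loss + (lhs - rhs) ** 2
            count += 1
    return loss / count

# toy usage: random tensors stand in for network outputs along one trajectory
T = 8
log_flow = torch.randn(T + 1, requires_grad=True)
log_pf, log_pb = torch.randn(T), torch.randn(T)
subtb_loss(log_flow, log_pf, log_pb).backward()
print(log_flow.grad.shape)
```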
- A Geometric Perspective on Diffusion Models [57.27857591493788]
We inspect the ODE-based sampling of a popular variance-exploding SDE.
We establish a theoretical relationship between the optimal ODE-based sampling and the classic mean-shift (mode-seeking) algorithm.
arXiv Detail & Related papers (2023-05-31T15:33:16Z)
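For reference, this is the classic mean-shift iteration the paper relates to ODE-based sampling: each query point moves to the kernel-weighted average of the data, ascending the kernel density estimate toward a mode. The toy data and bandwidth below are illustrative.

```python
# Mean-shift (mode-seeking) iteration with a Gaussian kernel on toy 1-D data.
import numpy as np

rng = np.random.default_rng(0)
data = np.vstack([rng.normal(-3, 0.4, (100, 1)), rng.normal(3, 0.4, (100, 1))])

def mean_shift_step(x, data, h):
    # w_ij proportional to exp(-||x_i - data_j||^2 / (2 h^2)); x_i moves to
    # the w-weighted average of the data points.
    d2 = ((x[:, None, :] - data[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2 / (2 * h * h))
    w /= w.sum(1, keepdims=True)
    return w @ data

x = rng.uniform(-6, 6, (20, 1))        # initial query points
for _ in range(50):
    x = mean_shift_step(x, data, h=0.5)
print(np.unique(np.round(x, 1)))       # queries collapse onto the two modes
```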
- Detecting and Mitigating Mode-Collapse for Flow-based Sampling of Lattice Field Theories [6.222204646855336]
We study the consequences of mode-collapse of normalizing flows in the context of lattice field theory.
We propose a metric to quantify the degree of mode-collapse and derive a bound on the resulting bias.
arXiv Detail & Related papers (2023-02-27T19:00:22Z)
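A hedged diagnostic in this spirit, not necessarily the paper's metric: importance weights w = p/q evaluated on flow samples are blind to modes the flow never visits, so the reverse-direction effective sample size can look healthy under severe mode collapse, while the same comparison on target-side samples exposes the missing mass.

```python
# Why flow-side diagnostics miss mode collapse, and a target-side check.
import numpy as np

rng = np.random.default_rng(0)

def log_p(x):                  # normalized bimodal target: 0.5 N(-3,1) + 0.5 N(3,1)
    a = -0.5 * (x + 3) ** 2
    b = -0.5 * (x - 3) ** 2
    return np.logaddexp(a, b) - np.log(2 * np.sqrt(2 * np.pi))

def log_q(x):                  # "collapsed" flow: covers only the right mode
    return -0.5 * (x - 3) ** 2 - 0.5 * np.log(2 * np.pi)

xq = rng.normal(3, 1, 100000)                  # samples from the flow q
lw = log_p(xq) - log_q(xq)
ess = np.exp(2 * np.logaddexp.reduce(lw) - np.logaddexp.reduce(2 * lw)) / len(lw)
print(f"reverse ESS fraction: {ess:.3f}")      # near 1 despite the missed mode

xp = np.concatenate([rng.normal(-3, 1, 50000), rng.normal(3, 1, 50000)])  # target
covered = np.exp(np.minimum(log_q(xp) - log_p(xp), 0.0)).mean()
print(f"target mass covered by q: {covered:.3f}")  # about 0.5 reveals the collapse
```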
- Improving Diffusion Models for Inverse Problems using Manifold Constraints [55.91148172752894]
We show that current solvers throw the sample path off the data manifold, and hence the error accumulates.
To address this, we propose an additional correction term inspired by the manifold constraint.
We show that our method is superior to the previous methods both theoretically and empirically.
arXiv Detail & Related papers (2022-06-02T09:06:10Z)
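A hedged sketch of such a correction step: after a standard reverse-diffusion update, take a gradient step on the measurement residual computed through the denoised estimate, so the iterate is pulled back toward measurement-consistent points on the learned manifold. The denoiser below is a hypothetical stand-in for a trained score model, and the step size and noise scale are illustrative.

```python
# Manifold-constraint-style correction for a linear inverse problem y = A x.
import torch

def mcg_correction(x_t, y, A, denoiser, sigma_t, step_size=1.0):
    """One manifold-constrained gradient correction step (sketch)."""
    x_t = x_t.detach().requires_grad_(True)
    # Tweedie-style denoised estimate: x0_hat = x_t + sigma_t^2 * score(x_t)
    x0_hat = x_t + sigma_t ** 2 * denoiser(x_t)
    residual = ((y - A @ x0_hat) ** 2).sum()
    grad = torch.autograd.grad(residual, x_t)[0]   # gradient through x0_hat
    return x_t.detach() - step_size * grad

# toy usage with a linear layer standing in for a trained score network
d, m = 8, 3
A = torch.randn(m, d)
y = torch.randn(m)
denoiser = torch.nn.Linear(d, d)                   # hypothetical stand-in
x_t = torch.randn(d)
x_t = mcg_correction(x_t, y, A, denoiser, sigma_t=0.5)
print(x_t.shape)
```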