Resampling Base Distributions of Normalizing Flows
- URL: http://arxiv.org/abs/2110.15828v1
- Date: Fri, 29 Oct 2021 14:44:44 GMT
- Title: Resampling Base Distributions of Normalizing Flows
- Authors: Vincent Stimper, Bernhard Schölkopf, José Miguel Hernández-Lobato
- Abstract summary: We introduce a base distribution for normalizing flows based on learned rejection sampling.
We develop suitable learning algorithms based on both maximizing the log-likelihood and minimizing the reverse Kullback-Leibler divergence.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Normalizing flows are a popular class of models for approximating probability
distributions. However, their invertible nature limits their ability to model
target distributions with a complex topological structure, such as Boltzmann
distributions. Several procedures have been proposed to solve this problem but
many of them sacrifice invertibility and, thereby, tractability of the
log-likelihood as well as other desirable properties. To address these
limitations, we introduce a base distribution for normalizing flows based on
learned rejection sampling, allowing the resulting normalizing flow to model
complex topologies without giving up bijectivity. Furthermore, we develop
suitable learning algorithms based on both maximizing the log-likelihood and
minimizing the reverse Kullback-Leibler divergence, and apply them to
various sample problems, i.e., approximating 2D densities, density estimation
of tabular data, image generation, and modeling Boltzmann distributions. In
these experiments our method is competitive with or outperforms the baselines.
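To make the construction concrete, below is a minimal PyTorch sketch of a resampled base distribution: a learned acceptance network a(z) reweights a standard Gaussian, samples are drawn by truncated rejection sampling, and the normalizer Z = E[a(z)] is estimated by Monte Carlo. All names (ResampledGaussian, accept_net, T) are illustrative assumptions, not the authors' code, and the small density correction from truncation that the paper treats exactly is omitted here.
```python
import math
import torch
import torch.nn as nn

class ResampledGaussian(nn.Module):
    """Base density q(z) proportional to a(z) * N(z; 0, I), where
    a(z) in (0, 1) is a learned acceptance probability."""

    def __init__(self, dim, hidden=128, T=100):
        super().__init__()
        self.dim, self.T = dim, T  # T caps the rejection-sampling rounds
        self.accept_net = nn.Sequential(
            nn.Linear(dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 1), nn.Sigmoid(),
        )

    def log_prob(self, z, n_mc=4096):
        # Monte Carlo estimate of the normalizer Z = E_{N(0,I)}[a(z)];
        # the exact truncation correction to the density is omitted here.
        with torch.no_grad():
            props = torch.randn(n_mc, self.dim, device=z.device)
            Z = self.accept_net(props).mean()
        log_phi = -0.5 * (z ** 2).sum(-1) - 0.5 * self.dim * math.log(2 * math.pi)
        return self.accept_net(z).squeeze(-1).log() + log_phi - Z.log()

    @torch.no_grad()
    def sample(self, n):
        # Truncated rejection sampling: accept Gaussian proposals with
        # probability a(z); after T rounds, keep the last proposal.
        z = torch.randn(n, self.dim)
        pending = torch.ones(n, dtype=torch.bool)
        for _ in range(self.T):
            if not pending.any():
                break
            idx = pending.nonzero(as_tuple=True)[0]
            acc = torch.rand(len(idx)) < self.accept_net(z[idx]).squeeze(-1)
            pending[idx[acc]] = False
            z[idx[~acc]] = torch.randn(int((~acc).sum()), self.dim)
        return z
```
Since the rejection step only reweights the Gaussian, the flow on top of this base stays bijective, which is the point of the construction.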
Related papers
- Theory on Score-Mismatched Diffusion Models and Zero-Shot Conditional Samplers [49.97755400231656]
We present the first performance guarantees with explicit dimensional dependence for general score-mismatched diffusion samplers.
We show that score mismatches result in a distributional bias between the target and sampling distributions, proportional to the accumulated mismatch between the target and training distributions.
This result can be directly applied to zero-shot conditional samplers for any conditional model, irrespective of measurement noise.
arXiv Detail & Related papers (2024-10-17T16:42:12Z) - Distributed Markov Chain Monte Carlo Sampling based on the Alternating
Direction Method of Multipliers [143.6249073384419]
In this paper, we propose a distributed sampling scheme based on the alternating direction method of multipliers.
We provide both theoretical guarantees of our algorithm's convergence and experimental evidence of its superiority to the state-of-the-art.
In simulation, we deploy our algorithm on linear and logistic regression tasks and illustrate its fast convergence compared to existing gradient-based methods.
arXiv Detail & Related papers (2024-01-29T02:08:40Z) - Out-of-distribution detection using normalizing flows on the data
manifold [3.725042082196983]
This study investigates the effect of manifold learning using normalizing flows on out-of-distribution detection.
We show that manifold learning improves the out-of-distribution detection ability of a class of likelihood-based models known as normalizing flows.
arXiv Detail & Related papers (2023-08-26T07:35:16Z) - Piecewise Normalizing Flows [0.0]
A mismatch between the topology of the target and the base distribution can result in poor performance.
A number of different works have attempted to modify the topology of the base distribution to better match the target.
We introduce piecewise normalizing flows which divide the target distribution into clusters, with topologies that better match the standard normal base distribution.
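As a toy illustration of this piecewise idea (not the paper's method), the sketch below hard-partitions target samples with k-means and fits one simple flow per cluster; the one-layer affine "flow" and all helper names are assumptions.
```python
import numpy as np
from sklearn.cluster import KMeans

def fit_piecewise(x, k=4):
    # Hard-partition the target samples, then fit one flow per cluster.
    labels = KMeans(n_clusters=k, n_init=10).fit_predict(x)
    pieces = []
    for j in range(k):
        xj = x[labels == j]
        # Maximum-likelihood affine "flow": z = (x - mu) / sigma maps each
        # cluster to a roughly standard-normal base distribution.
        mu, sigma = xj.mean(0), xj.std(0) + 1e-6
        pieces.append((len(xj) / len(x), mu, sigma))
    return pieces

def log_prob(x, pieces):
    # Combine the per-cluster flows, weighted by cluster mass.
    comps = []
    for w, mu, sigma in pieces:
        z = (x - mu) / sigma
        lp = -0.5 * (z ** 2 + np.log(2 * np.pi)).sum(-1) - np.log(sigma).sum()
        comps.append(np.log(w) + lp)
    return np.logaddexp.reduce(np.stack(comps), axis=0)
```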
arXiv Detail & Related papers (2023-05-04T15:30:10Z) - Reflected Diffusion Models [93.26107023470979]
We present Reflected Diffusion Models, which reverse a reflected stochastic differential equation evolving on the support of the data.
Our approach learns the score function through a generalized score matching loss and extends key components of standard diffusion models.
arXiv Detail & Related papers (2023-04-10T17:54:38Z) - Building Normalizing Flows with Stochastic Interpolants [11.22149158986164]
A simple generative model based on a continuous-time normalizing flow between any pair of base and target distributions is proposed.
The velocity field of this flow is inferred from the probability current of a time-dependent distribution that interpolates between the base and the target in finite time.
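A minimal sketch of this construction, using the simplest (linear) interpolant x_t = (1 - t) x0 + t x1: regressing a network v(x_t, t) onto the conditional velocity x1 - x0 matches the probability-current velocity in expectation. The network and all names are illustrative assumptions.
```python
import torch
import torch.nn as nn

dim = 2
v = nn.Sequential(nn.Linear(dim + 1, 128), nn.SiLU(), nn.Linear(128, dim))
opt = torch.optim.Adam(v.parameters(), lr=1e-3)

def step(x0, x1):
    """x0 ~ base, x1 ~ target; regress v(x_t, t) onto the velocity."""
    t = torch.rand(len(x0), 1)
    xt = (1 - t) * x0 + t * x1   # linear interpolant between base and target
    target = x1 - x0             # d/dt of the linear interpolant
    loss = ((v(torch.cat([xt, t], -1)) - target) ** 2).mean()
    opt.zero_grad(); loss.backward(); opt.step()
    return loss.item()

# After training, integrating dx/dt = v(x, t) from t=0 to t=1 transports
# base samples to (approximately) target samples in finite time.
```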
arXiv Detail & Related papers (2022-09-30T16:30:31Z) - ManiFlow: Implicitly Representing Manifolds with Normalizing Flows [145.9820993054072]
Normalizing Flows (NFs) are flexible explicit generative models that have been shown to accurately model complex real-world data distributions.
We propose an optimization objective that recovers the most likely point on the manifold given a sample from the perturbed distribution.
Finally, we focus on 3D point clouds, for which we utilize the explicit nature of NFs, i.e., surface normals extracted from the gradient of the log-likelihood and the log-likelihood itself.
arXiv Detail & Related papers (2022-08-18T16:07:59Z) - Efficient CDF Approximations for Normalizing Flows [64.60846767084877]
We build upon the diffeomorphic properties of normalizing flows to estimate the cumulative distribution function (CDF) over a closed region.
Our experiments on popular flow architectures and UCI datasets show a marked improvement in sample efficiency as compared to traditional estimators.
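For context, the traditional estimator such methods improve on is plain Monte Carlo: sample from the flow and count the fraction of samples falling inside the region. A hedged sketch, assuming a flow object exposing a `sample` method:
```python
import torch

def box_mass(flow, lo, hi, n=100_000):
    # Naive Monte Carlo estimate of the probability mass the flow assigns
    # to an axis-aligned box [lo, hi] -- the baseline, not the paper's method.
    x = flow.sample(n)                        # samples from the flow
    inside = ((x >= lo) & (x <= hi)).all(-1)  # membership in the box
    return inside.float().mean().item()       # fraction inside ~ CDF mass
```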
arXiv Detail & Related papers (2022-02-23T06:11:49Z) - Tractable Density Estimation on Learned Manifolds with Conformal
Embedding Flows [0.0]
Normalizing flows provide tractable density estimation by transforming a simple base distribution into a complex target distribution.
However, this construction cannot directly model data supported on a low-dimensional manifold.
Recent attempts to remedy this have introduced geometric complications that defeat a central benefit of normalizing flows: exact density estimation.
We argue that composing a standard flow with a trainable conformal embedding is the most natural way to model manifold-supported data.
arXiv Detail & Related papers (2021-06-09T18:00:00Z) - Variational Mixture of Normalizing Flows [0.0]
Deep generative models, such as generative adversarial networks, variational autoencoders, and their variants, have seen wide adoption for the task of modelling complex data distributions, but they generally do not provide tractable likelihoods.
Normalizing flows have overcome this limitation by leveraging the change-of-variables formula for probability density functions.
The present work overcomes this by using normalizing flows as components in a mixture model and devising an end-to-end training procedure for such a model.
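A minimal sketch of such a mixture, with one-layer affine flows as components and end-to-end maximum-likelihood training via logsumexp; the component architecture and names are illustrative assumptions, not the paper's model.
```python
import math
import torch
import torch.nn as nn

class AffineFlowMixture(nn.Module):
    def __init__(self, k, dim):
        super().__init__()
        self.logits = nn.Parameter(torch.zeros(k))      # mixture weights
        self.mu = nn.Parameter(torch.randn(k, dim))     # per-component shift
        self.log_s = nn.Parameter(torch.zeros(k, dim))  # per-component log-scale

    def log_prob(self, x):
        # Each component: z = (x - mu) * exp(-log_s), standard-normal base,
        # with log|det| = -sum(log_s) from the change-of-variables formula.
        z = (x[:, None, :] - self.mu) * torch.exp(-self.log_s)
        log_base = -0.5 * (z ** 2).sum(-1) - 0.5 * x.shape[-1] * math.log(2 * math.pi)
        log_comp = log_base - self.log_s.sum(-1)        # shape (n, k)
        log_w = torch.log_softmax(self.logits, -1)
        return torch.logsumexp(log_w + log_comp, -1)    # soft mixture density

model = AffineFlowMixture(k=5, dim=2)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
# Training loop (end to end): loss = -model.log_prob(batch).mean();
# loss.backward(); opt.step()
```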
arXiv Detail & Related papers (2020-09-01T17:20:08Z) - Learning Likelihoods with Conditional Normalizing Flows [54.60456010771409]
Conditional normalizing flows (CNFs) are efficient in sampling and inference.
We present a study of CNFs in which the mapping from the base density to the output space is conditioned on an input x, in order to model conditional densities p(y|x).
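A minimal sketch of the conditional setup, assuming a single affine transform of y whose shift and scale are produced by a network applied to x; the architecture is an assumption, not the paper's model.
```python
import math
import torch
import torch.nn as nn

class ConditionalAffineFlow(nn.Module):
    def __init__(self, x_dim, y_dim, hidden=64):
        super().__init__()
        # The conditioner maps x to the parameters of the y-transform.
        self.net = nn.Sequential(nn.Linear(x_dim, hidden), nn.ReLU(),
                                 nn.Linear(hidden, 2 * y_dim))
        self.y_dim = y_dim

    def log_prob(self, y, x):
        mu, log_s = self.net(x).chunk(2, dim=-1)
        z = (y - mu) * torch.exp(-log_s)  # map y to the base space
        log_base = -0.5 * (z ** 2).sum(-1) - 0.5 * self.y_dim * math.log(2 * math.pi)
        return log_base - log_s.sum(-1)   # + log|det dz/dy|

    def sample(self, x):
        mu, log_s = self.net(x).chunk(2, dim=-1)
        return mu + torch.exp(log_s) * torch.randn_like(mu)  # inverse map
```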
arXiv Detail & Related papers (2019-11-29T19:17:58Z)