Piecewise Normalizing Flows
- URL: http://arxiv.org/abs/2305.02930v2
- Date: Thu, 1 Feb 2024 12:06:30 GMT
- Title: Piecewise Normalizing Flows
- Authors: Harry Bevins, Will Handley, Thomas Gessey-Jones
- Abstract summary: A mismatch between the topology of the target and the base can result in poor performance.
A number of different works have attempted to modify the topology of the base distribution to better match the target.
We introduce piecewise normalizing flows which divide the target distribution into clusters, with topologies that better match the standard normal base distribution.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Normalizing flows are an established approach for modelling complex
probability densities through invertible transformations from a base
distribution. However, the accuracy with which the target distribution can be
captured by the normalizing flow is strongly influenced by the topology of the
base distribution. A mismatch between the topology of the target and the base
can result in poor performance, as is typically the case for multi-modal
problems. A number of different works have attempted to modify the topology of
the base distribution to better match the target, either through the use of
Gaussian Mixture Models (Izmailov et al., 2020; Ardizzone et al., 2020;
Hagemann & Neumayer, 2021) or learned accept/reject sampling (Stimper et al.,
2022). We introduce piecewise normalizing flows which divide the target
distribution into clusters, with topologies that better match the standard
normal base distribution, and train a series of flows to model complex
multi-modal targets. We demonstrate the performance of the piecewise flows
using some standard benchmarks and compare the accuracy of the flows to the
approach taken in Stimper et al. (2022) for modelling multi-modal
distributions. We find that our approach consistently outperforms the approach
in Stimper et al. (2022) with higher emulation accuracy on the standard
benchmarks.
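To make the clustering-plus-flows idea concrete, here is a minimal sketch of the piecewise approach described in the abstract. It assumes k-means for the clustering step and, purely for brevity, uses a moment-matched affine map from the standard normal base as a stand-in for each trained flow; the mixture weighting by cluster size is likewise an assumption of this sketch, not a statement of the paper's exact method.

```python
# Hypothetical sketch of piecewise normalizing flows: cluster the target
# samples, fit one flow per cluster, and sample from the weighted mixture.
# Each "flow" here is a per-cluster affine map (shift + scale) from a
# standard normal base; the paper trains a full normalizing flow per cluster.
import numpy as np
from scipy.cluster.vq import kmeans2

rng = np.random.default_rng(0)

# Toy bimodal target: two well-separated 2-D Gaussian blobs.
target = np.vstack([
    rng.normal(loc=(-4.0, 0.0), scale=0.5, size=(2000, 2)),
    rng.normal(loc=(+4.0, 0.0), scale=0.5, size=(2000, 2)),
])

k = 2
_, labels = kmeans2(target, k, minit="++")

# Fit one affine "flow" per cluster: z ~ N(0, I), x = mu + sigma * z,
# so each cluster sees a base distribution whose topology matches it.
flows, weights = [], []
for c in range(k):
    pts = target[labels == c]
    flows.append((pts.mean(axis=0), pts.std(axis=0)))
    weights.append(len(pts) / len(target))

def sample(n):
    """Draw n samples: pick a cluster by weight, push base noise through its flow."""
    comps = rng.choice(k, size=n, p=weights)
    z = rng.standard_normal((n, 2))
    mus = np.array([flows[c][0] for c in comps])
    sigmas = np.array([flows[c][1] for c in comps])
    return mus + sigmas * z

print(sample(5))
```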
Related papers
- Straightness of Rectified Flow: A Theoretical Insight into Wasserstein Convergence [54.580605276017096]
Rectified Flow (RF) aims to learn straight flow trajectories from noise to data using a sequence of convex optimization problems.
RF theoretically straightens the trajectory through successive rectifications, reducing the number of function evaluations (NFEs) needed during sampling.
We provide the first theoretical analysis of the Wasserstein distance between the sampling distribution of RF and the target distribution.
arXiv Detail & Related papers (2024-10-19T02:36:11Z)
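A minimal sketch of the straight-path regression objective behind rectified flow as summarized in the entry above: the velocity field is trained to predict x1 - x0 along the linear interpolation, and the straighter the learned trajectory, the fewer Euler steps (NFEs) sampling needs. The toy two-mode data and the small MLP are illustrative choices, not taken from the paper.

```python
# Rectified-flow sketch: learn a velocity field v(x_t, t) that predicts
# x1 - x0 along the straight interpolation x_t = (1 - t) * x0 + t * x1.
import torch
import torch.nn as nn

torch.manual_seed(0)
dim = 2
v_net = nn.Sequential(nn.Linear(dim + 1, 64), nn.ReLU(), nn.Linear(64, dim))
opt = torch.optim.Adam(v_net.parameters(), lr=1e-3)

def data_batch(n):
    # Toy two-mode target (illustrative only).
    signs = torch.randint(0, 2, (n, 1)).float() * 2 - 1
    return torch.randn(n, dim) * 0.3 + signs * torch.tensor([3.0, 0.0])

for step in range(500):
    x1 = data_batch(256)           # data samples
    x0 = torch.randn_like(x1)      # base noise
    t = torch.rand(x1.shape[0], 1)
    xt = (1 - t) * x0 + t * x1     # point on the straight path
    target_v = x1 - x0             # constant velocity of the straight path
    pred = v_net(torch.cat([xt, t], dim=1))
    loss = ((pred - target_v) ** 2).mean()
    opt.zero_grad(); loss.backward(); opt.step()

# Sampling: integrate dx/dt = v(x, t) from t=0 to 1 with Euler steps (NFEs).
x = torch.randn(8, dim)
nfe = 20
with torch.no_grad():
    for i in range(nfe):
        t = torch.full((x.shape[0], 1), i / nfe)
        x = x + v_net(torch.cat([x, t], dim=1)) / nfe
print(x)
```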
- Theory on Score-Mismatched Diffusion Models and Zero-Shot Conditional Samplers [49.97755400231656]
We present the first performance guarantees with explicit dimensional dependencies for general score-mismatched diffusion samplers.
We show that score mismatches result in a distributional bias between the target and sampling distributions, proportional to the accumulated mismatch between the target and training distributions.
This result can be directly applied to zero-shot conditional samplers for any conditional model, irrespective of measurement noise.
arXiv Detail & Related papers (2024-10-17T16:42:12Z)
- Marginalization Consistent Mixture of Separable Flows for Probabilistic Irregular Time Series Forecasting [4.714246221974192]
We develop a novel probabilistic irregular time series forecasting model, Marginalization Consistent Mixtures of Separable Flows (moses).
moses outperforms other state-of-the-art marginalization-consistent models and performs on par with ProFITi but, unlike ProFITi, guarantees marginalization consistency.
arXiv Detail & Related papers (2024-06-11T13:28:43Z)
- Normalizing flow sampling with Langevin dynamics in the latent space [12.91637880428221]
Normalizing flows (NFs) use a continuous generator to map a simple latent (e.g. Gaussian) distribution towards an empirical target distribution associated with a training data set.
Since standard NFs implement differentiable maps, they may suffer from pathological behaviors when targeting complex distributions.
This paper proposes a new Markov chain Monte Carlo algorithm to sample from the target distribution in the latent domain before transporting it back to the target domain.
arXiv Detail & Related papers (2023-05-20T09:31:35Z)
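A hedged sketch of the latent-space sampling pattern summarized in the entry above: run unadjusted Langevin dynamics (ULA) on the pullback of the target density through the flow, then transport the chain back through the map. The fixed invertible map T and the 1-D bimodal target below are illustrative stand-ins for a trained flow and its target; the paper's actual MCMC scheme may differ in detail.

```python
# ULA in latent space: z <- z + eps * grad log pi(z) + sqrt(2 eps) * noise,
# where pi is the pullback of the target density through an invertible map T.
import torch

torch.manual_seed(0)

def T(z):
    # Simple element-wise invertible map (derivative is strictly positive).
    return z + 0.5 * torch.tanh(z)

def log_target(x):
    # Bimodal 1-D target: equal mixture of two unit-variance Gaussians.
    return torch.logsumexp(torch.stack([
        -0.5 * (x - 3.0) ** 2, -0.5 * (x + 3.0) ** 2
    ]), dim=0) - torch.log(torch.tensor(2.0))

def log_pullback(z):
    # log pi(z) = log p_target(T(z)) + log |dT/dz| (change of variables).
    log_det = torch.log(1 + 0.5 * (1 - torch.tanh(z) ** 2))
    return log_target(T(z)) + log_det

z = torch.zeros(512, requires_grad=True)
eps = 0.05
for _ in range(1000):
    logp = log_pullback(z).sum()
    grad, = torch.autograd.grad(logp, z)
    with torch.no_grad():
        z = z + eps * grad + (2 * eps) ** 0.5 * torch.randn_like(z)
    z.requires_grad_(True)

x = T(z.detach())  # transport the chain back to data space
print(x.mean().item(), x.std().item())
```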
- Building Normalizing Flows with Stochastic Interpolants [11.22149158986164]
A simple generative model based on a continuous-time normalizing flow between any pair of base and target distributions is proposed.
The velocity field of this flow is inferred from the probability current of a time-dependent distribution that interpolates between the base and the target in finite time.
arXiv Detail & Related papers (2022-09-30T16:30:31Z)
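A small sketch of the interpolant construction summarized in the entry above: a path x_t between base and target samples whose time derivative serves as the regression target for the velocity field of a continuous-time flow. The trigonometric interpolant used here is one common choice and an assumption of this sketch.

```python
# Stochastic-interpolant sketch (assumed trigonometric form):
# x_t = cos(pi*t/2) * x0 + sin(pi*t/2) * x1; the velocity field is fit by
# regressing the time derivative of x_t onto (x_t, t).
import math
import torch

def interpolant_pair(x0, x1, t):
    """Return a point on the interpolant and its time derivative
    (the regression target for the velocity field)."""
    a, b = torch.cos(math.pi * t / 2), torch.sin(math.pi * t / 2)
    xt = a * x0 + b * x1
    dxt = (math.pi / 2) * (-torch.sin(math.pi * t / 2) * x0
                           + torch.cos(math.pi * t / 2) * x1)
    return xt, dxt

x0 = torch.randn(256, 2)            # base samples
x1 = torch.randn(256, 2) + 4.0      # stand-in target samples
t = torch.rand(256, 1)
xt, dxt = interpolant_pair(x0, x1, t)
# Training objective for a velocity network v: ((v(xt, t) - dxt) ** 2).mean()
print(xt.shape, dxt.shape)
```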
- Efficient CDF Approximations for Normalizing Flows [64.60846767084877]
We build upon the diffeomorphic properties of normalizing flows to estimate the cumulative distribution function (CDF) over a closed region.
Our experiments on popular flow architectures and UCI datasets show a marked improvement in sample efficiency as compared to traditional estimators.
arXiv Detail & Related papers (2022-02-23T06:11:49Z)
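For contrast with the boundary-based method summarized in the entry above, here is the traditional Monte Carlo estimator it improves on in sample efficiency: draw samples from the model and count hits inside the closed region. A standard normal stands in for the trained flow.

```python
# Naive Monte Carlo CDF estimate: P(X in R) ~= fraction of samples in R.
import numpy as np

rng = np.random.default_rng(0)
samples = rng.standard_normal((100_000, 2))   # stand-in for flow samples

# Closed region R: the axis-aligned box [-1, 1] x [-1, 1].
inside = np.all(np.abs(samples) <= 1.0, axis=1)
print("P(X in R) ~=", inside.mean())          # ~0.466 for N(0, I) on this box
```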
- Resampling Base Distributions of Normalizing Flows [0.0]
We introduce a base distribution for normalizing flows based on learned rejection sampling.
We develop suitable learning algorithms based on both maximum likelihood and optimization of the reverse Kullback-Leibler divergence.
arXiv Detail & Related papers (2021-10-29T14:44:44Z)
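A minimal sketch of a resampled base distribution as summarized in the entry above: Gaussian proposals are kept with probability a(z), reshaping the base's topology before the flow is applied. The fixed acceptance function below is an illustrative stand-in for the learned network trained jointly with the flow in the paper.

```python
# Learned-rejection-sampling base: accept proposal z with probability a(z).
import numpy as np

rng = np.random.default_rng(0)

def accept_prob(z):
    # Stand-in for the learned acceptance network: favours |z_0| large,
    # so the resampled base becomes bimodal along the first axis.
    return 1.0 / (1.0 + np.exp(-4.0 * (np.abs(z[:, 0]) - 1.0)))

def sample_base(n):
    out = []
    while sum(len(b) for b in out) < n:
        z = rng.standard_normal((4 * n, 2))        # batch of proposals
        keep = rng.random(len(z)) < accept_prob(z)  # stochastic accept/reject
        out.append(z[keep])
    return np.concatenate(out)[:n]

z = sample_base(10_000)
print("fraction with z_0 > 0:", (z[:, 0] > 0).mean())
```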
- KL Guided Domain Adaptation [88.19298405363452]
Domain adaptation is an important problem and is often needed for real-world applications.
A common approach in the domain adaptation literature is to learn a representation of the input that has the same distributions over the source and the target domain.
We show that with a probabilistic representation network, the KL term can be estimated efficiently via minibatch samples.
arXiv Detail & Related papers (2021-06-14T22:24:23Z)
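A small sketch of the minibatch KL estimate mentioned in the entry above, with the source- and target-domain representation distributions simplified to diagonal Gaussians (an assumption of this sketch, standing in for the probabilistic representation network): the Monte Carlo average of log-density ratios over a minibatch approximates the exact KL.

```python
# Minibatch KL estimate: KL(p_s || p_t) ~= mean_i [log p_s(z_i) - log p_t(z_i)]
# with z_i sampled from the source representation distribution p_s.
import torch

torch.manual_seed(0)
mu_s, std_s = torch.zeros(8), torch.ones(8)
mu_t, std_t = 0.5 * torch.ones(8), 1.5 * torch.ones(8)
p_s = torch.distributions.Normal(mu_s, std_s)
p_t = torch.distributions.Normal(mu_t, std_t)

z = p_s.sample((1024,))                        # minibatch of representations
mc_kl = (p_s.log_prob(z) - p_t.log_prob(z)).sum(dim=1).mean()
exact = torch.distributions.kl_divergence(p_s, p_t).sum()
print(float(mc_kl), float(exact))              # the two should be close
```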
- Evaluating State-of-the-Art Classification Models Against Bayes Optimality [106.50867011164584]
We show that we can compute the exact Bayes error of generative models learned using normalizing flows.
We use our approach to conduct a thorough investigation of state-of-the-art classification models.
arXiv Detail & Related papers (2021-06-07T06:21:20Z)
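A sketch of the Bayes-error computation that exact likelihoods enable, as summarized in the entry above: with class-conditional densities p(x|c) and class priors, the Bayes error is E_x[1 - max_c p(c|x)]. Gaussian class conditionals stand in here for the per-class normalizing flows of the paper.

```python
# Monte Carlo Bayes error from exact class-conditional densities.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
priors = np.array([0.5, 0.5])
locs = np.array([-1.0, 1.0])        # two overlapping 1-D classes

# Sample x from the mixture, then evaluate posteriors via exact densities.
c = rng.choice(2, size=200_000, p=priors)
x = rng.normal(loc=locs[c], scale=1.0)
joint = priors * norm.pdf(x[:, None], loc=locs, scale=1.0)   # p(x, c)
posterior = joint / joint.sum(axis=1, keepdims=True)         # p(c | x)
bayes_error = (1.0 - posterior.max(axis=1)).mean()
print(bayes_error)                  # ~Phi(-1) ~= 0.159 for this toy setup
```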
- Gaussianization Flows [113.79542218282282]
We propose a new type of normalizing flow model that enables both efficient computation of likelihoods and efficient inversion for sample generation.
Because this expressivity is guaranteed (the models are universal approximators of continuous distributions), they can capture multimodal target distributions without compromising the efficiency of sample generation.
arXiv Detail & Related papers (2020-03-04T08:15:06Z)
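An illustrative single Gaussianization step, the building block the paper summarized above makes trainable end-to-end: marginal Gaussianization of each dimension followed by a rotation. The empirical-CDF-plus-probit transform and the random rotation below are assumptions of this sketch, not the paper's learned layers.

```python
# One classic Gaussianization iteration: per-dimension marginal
# Gaussianization (empirical CDF, then Gaussian quantile function), then
# a rotation to mix dimensions; iterating drives the data towards N(0, I).
import numpy as np
from scipy.stats import norm, special_ortho_group

rng = np.random.default_rng(0)
x = rng.standard_normal((5000, 2)) ** 3          # heavy-tailed toy data

def marginal_gaussianize(x):
    # Map each dimension through rank / (n + 1), then the probit function.
    n = x.shape[0]
    ranks = x.argsort(axis=0).argsort(axis=0) + 1
    return norm.ppf(ranks / (n + 1))

z = marginal_gaussianize(x)
rot = special_ortho_group.rvs(2, random_state=0)  # random rotation
z = z @ rot.T
print(z.mean(axis=0), z.std(axis=0))              # roughly standard normal
```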