Gaussianization Flows
- URL: http://arxiv.org/abs/2003.01941v1
- Date: Wed, 4 Mar 2020 08:15:06 GMT
- Title: Gaussianization Flows
- Authors: Chenlin Meng, Yang Song, Jiaming Song and Stefano Ermon
- Abstract summary: We propose a new type of normalizing flow model that enables both efficient computation of likelihoods and efficient inversion for sample generation.
Because of this guaranteed expressivity, they can capture multimodal target distributions without compromising the efficiency of sample generation.
- Score: 113.79542218282282
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Iterative Gaussianization is a fixed-point iteration procedure that can
transform any continuous random vector into a Gaussian one. Based on iterative
Gaussianization, we propose a new type of normalizing flow model that enables
both efficient computation of likelihoods and efficient inversion for sample
generation. We demonstrate that these models, named Gaussianization flows, are
universal approximators for continuous probability distributions under some
regularity conditions. Because of this guaranteed expressivity, they can
capture multimodal target distributions without compromising the efficiency of
sample generation. Experimentally, we show that Gaussianization flows achieve
better or comparable performance on several tabular datasets compared to other
efficiently invertible flow models such as Real NVP, Glow and FFJORD. In
particular, Gaussianization flows are easier to initialize, demonstrate better
robustness with respect to different transformations of the training data, and
generalize better on small training sets.
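The core procedure the abstract describes — repeatedly making each marginal Gaussian and then rotating the data — can be illustrated with a minimal, self-contained sketch. This is an illustration of one iterative Gaussianization step (empirical-CDF-plus-probit marginal Gaussianization followed by a 2-D rotation), not the paper's trainable flow model; the function names and the fixed rotation angle are choices made here for clarity.

```python
import math
from statistics import NormalDist

def marginal_gaussianize(xs):
    """Map 1-D samples to approximately standard normal:
    empirical CDF rank -> Gaussian inverse CDF (probit)."""
    n = len(xs)
    nd = NormalDist()
    order = sorted(range(n), key=lambda i: xs[i])
    out = [0.0] * n
    for rank, i in enumerate(order):
        out[i] = nd.inv_cdf((rank + 1) / (n + 1))
    return out

def rotate(points, theta):
    """Apply a 2-D rotation, mixing the coordinates so the next
    marginal step sees a new axis."""
    c, s = math.cos(theta), math.sin(theta)
    return [(c * x - s * y, s * x + c * y) for x, y in points]

def gaussianization_step(points, theta):
    """One fixed-point iteration: Gaussianize each marginal,
    then rotate. Repeating this drives the joint toward N(0, I)."""
    xs = marginal_gaussianize([p[0] for p in points])
    ys = marginal_gaussianize([p[1] for p in points])
    return rotate(list(zip(xs, ys)), theta)
```

Applying `gaussianization_step` several times with varying angles transforms even a bimodal input toward a standard Gaussian; the paper's model replaces the empirical CDF and fixed rotations with trainable layers so the whole map is invertible with a tractable Jacobian.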
Related papers
- Adaptivity and Convergence of Probability Flow ODEs in Diffusion Generative Models [5.064404027153094]
This paper contributes to establishing theoretical guarantees for the probability flow ODE, a diffusion-based sampler known for its practical efficiency.
We demonstrate that, with accurate score function estimation, the probability flow ODE sampler achieves a convergence rate of $O(k/T)$ in total variation distance.
This dimension-free convergence rate improves upon existing results that scale with the typically much larger ambient dimension.
arXiv Detail & Related papers (2025-01-31T03:10:10Z)
- 2-Rectifications are Enough for Straight Flows: A Theoretical Insight into Wasserstein Convergence [54.580605276017096]
We provide the first theoretical analysis of the Wasserstein distance between the sampling distribution of Rectified Flow and the target distribution.
We show that for a rectified flow from a Gaussian to any general target distribution with finite first moment, two rectifications are sufficient to achieve a straight flow.
arXiv Detail & Related papers (2024-10-19T02:36:11Z)
- Efficient, Multimodal, and Derivative-Free Bayesian Inference With Fisher-Rao Gradient Flows [10.153270126742369]
We study efficient approximate sampling for probability distributions known up to normalization constants.
We specifically focus on a problem class arising in Bayesian inference for large-scale inverse problems in science and engineering applications.
arXiv Detail & Related papers (2024-06-25T04:07:22Z)
- Marginalization Consistent Mixture of Separable Flows for Probabilistic Irregular Time Series Forecasting [4.714246221974192]
We develop a novel probabilistic irregular time series forecasting model, Marginalization Consistent Mixtures of Separable Flows (moses).
moses outperforms other state-of-the-art marginalization consistent models and performs on par with ProFITi, but unlike ProFITi, it guarantees marginalization consistency.
arXiv Detail & Related papers (2024-06-11T13:28:43Z)
- Learning Mixtures of Gaussians Using Diffusion Models [9.118706387430883]
We give a new algorithm for learning mixtures of $k$ Gaussians to TV error.
Our approach is analytic and relies on the framework of diffusion models.
arXiv Detail & Related papers (2024-04-29T17:00:20Z)
- MGF: Mixed Gaussian Flow for Diverse Trajectory Prediction [72.70572835589158]
We propose constructing a mixed Gaussian prior for a normalizing flow model for trajectory prediction.
Our method achieves state-of-the-art performance in the evaluation of both trajectory alignment and diversity on the popular UCY/ETH and SDD datasets.
arXiv Detail & Related papers (2024-02-19T15:48:55Z)
- Probabilistic Unrolling: Scalable, Inverse-Free Maximum Likelihood Estimation for Latent Gaussian Models [69.22568644711113]
We introduce probabilistic unrolling, a method that combines Monte Carlo sampling with iterative linear solvers to circumvent matrix inversions.
Our theoretical analyses reveal that unrolling and backpropagation through the iterations of the solver can accelerate gradient estimation for maximum likelihood estimation.
In experiments on simulated and real data, we demonstrate that probabilistic unrolling learns latent Gaussian models up to an order of magnitude faster than gradient EM, with minimal losses in model performance.
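The generic idea behind "inverse-free" estimation — replacing an explicit matrix inverse with an iterative linear solver — can be sketched with a textbook conjugate-gradient routine. This is a standard CG solver shown for illustration, not the authors' unrolled estimator; `matvec` and `conjugate_gradient` are names chosen here.

```python
def matvec(A, v):
    """Multiply a dense matrix (list of rows) by a vector."""
    return [sum(a * x for a, x in zip(row, v)) for row in A]

def conjugate_gradient(A, b, tol=1e-12, max_iter=100):
    """Solve A x = b for symmetric positive-definite A using only
    matrix-vector products, so A is never inverted explicitly."""
    n = len(b)
    x = [0.0] * n
    r = list(b)          # residual b - A x (x starts at 0)
    p = list(r)          # search direction
    rs = sum(ri * ri for ri in r)
    for _ in range(max_iter):
        Ap = matvec(A, p)
        alpha = rs / sum(pi * api for pi, api in zip(p, Ap))
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        rs_new = sum(ri * ri for ri in r)
        if rs_new < tol:
            break
        p = [ri + (rs_new / rs) * pi for ri, pi in zip(r, p)]
        rs = rs_new
    return x
```

Because the solver touches `A` only through `matvec`, gradients can be backpropagated through its iterations — which is the unrolling idea the summary describes, applied there to the covariance solves inside latent Gaussian model likelihoods.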
arXiv Detail & Related papers (2023-06-05T21:08:34Z)
- Forward Operator Estimation in Generative Models with Kernel Transfer Operators [37.999297683250575]
We show that our formulation enables highly efficient distribution approximation and sampling, and offers surprisingly good empirical performance.
We also show that the algorithm performs well in small-sample-size settings (in brain imaging).
arXiv Detail & Related papers (2021-12-01T06:54:31Z)
- SurVAE Flows: Surjections to Bridge the Gap between VAEs and Flows [78.77808270452974]
SurVAE Flows is a modular framework for composable transformations that encompasses VAEs and normalizing flows.
We show that several recently proposed methods, including dequantization and augmented normalizing flows, can be expressed as SurVAE Flows.
arXiv Detail & Related papers (2020-07-06T13:13:22Z)
- Semi-Supervised Learning with Normalizing Flows [54.376602201489995]
FlowGMM is an end-to-end approach to generative semi-supervised learning with normalizing flows.
We show promising results on a wide range of applications, including AG-News and Yahoo Answers text data.
arXiv Detail & Related papers (2019-12-30T17:36:33Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.