Turning Normalizing Flows into Monge Maps with Geodesic Gaussian Preserving Flows
- URL: http://arxiv.org/abs/2209.10873v4
- Date: Fri, 14 Apr 2023 05:31:18 GMT
- Title: Turning Normalizing Flows into Monge Maps with Geodesic Gaussian Preserving Flows
- Authors: Guillaume Morel (IMT Atlantique - ITI), Lucas Drumetz (IMT Atlantique - MEE, Lab-STICC_OSE), Simon Benaïchouche (IMT Atlantique), Nicolas Courty (IRISA, UBS), François Rousseau (IMT Atlantique - ITI, LaTIM)
- Abstract summary: This paper introduces a method to transform any trained NF into a more OT-efficient version without changing the final density.
We do so by learning a rearrangement of the source (Gaussian) distribution that minimizes the OT cost between the source and the final density.
The proposed method leads to smooth flows with reduced OT cost for several existing models without affecting the model performance.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Normalizing Flows (NF) are powerful likelihood-based generative models that
are able to trade off between expressivity and tractability to model complex
densities. A now well established research avenue leverages optimal transport
(OT) and looks for Monge maps, i.e. models with minimal effort between the
source and target distributions. This paper introduces a method based on
Brenier's polar factorization theorem to transform any trained NF into a more
OT-efficient version without changing the final density. We do so by learning a
rearrangement of the source (Gaussian) distribution that minimizes the OT cost
between the source and the final density. We further constrain the path leading
to the estimated Monge map to lie on a geodesic in the space of
volume-preserving diffeomorphisms thanks to Euler's equations. The proposed
method leads to smooth flows with reduced OT cost for several existing models
without affecting the model performance.
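As a rough illustration (not the paper's actual implementation), the simplest Gaussian-preserving rearrangement of a standard-normal source is a global rotation; the sketch below learns such a rotation in front of a frozen flow to reduce the quadratic OT cost of the composed map while leaving the pushforward density unchanged. The stand-in flow and all names are hypothetical.

    import torch

    torch.manual_seed(0)
    d = 2

    # Stand-in for a frozen, already-trained normalizing flow f (here a
    # fixed invertible affine map; any trained NF could take its place).
    W = torch.tensor([[2.0, 0.5], [0.0, 1.5]])
    b = torch.tensor([1.0, -1.0])
    def flow(z):
        return z @ W.T + b

    # Gaussian-preserving rearrangement r of the source: a global rotation,
    # parameterized via the matrix exponential of a skew-symmetric matrix,
    # so that r pushes N(0, I) onto itself exactly.
    A = torch.zeros(d, d, requires_grad=True)
    opt = torch.optim.Adam([A], lr=1e-2)

    for step in range(2000):
        R = torch.matrix_exp(A - A.T)   # orthogonal matrix, a rotation
        z = torch.randn(4096, d)
        # Quadratic transport cost of the composed map z -> flow(R z);
        # since R preserves N(0, I), the final density of flow(R .) is
        # the same as that of flow, only the transport cost changes.
        cost = ((flow(z @ R.T) - z) ** 2).sum(dim=1).mean()
        opt.zero_grad(); cost.backward(); opt.step()

For a standard Gaussian source, rotations are only the simplest Gaussian-preserving family; the paper optimizes over the much richer class of volume-preserving diffeomorphisms, with the path constrained to a geodesic through Euler's equations.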
Related papers
- Straightness of Rectified Flow: A Theoretical Insight into Wasserstein Convergence [54.580605276017096]
Diffusion models have emerged as a powerful tool for image generation and denoising.
Recently, Liu et al. introduced Rectified Flow (RF), a novel alternative generative model.
RF aims to learn straight flow trajectories from noise to data using a sequence of convex optimization problems.
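For reference, the standard RF training objective (as in Liu et al.; notation ours) regresses a velocity field onto straight-line interpolants:
$$ \min_{v}\; \mathbb{E}_{(X_0, X_1),\, t \sim \mathcal{U}[0,1]} \big\| (X_1 - X_0) - v(X_t, t) \big\|^2, \qquad X_t = t X_1 + (1 - t) X_0, $$
after which "reflow" iterations re-couple noise with the model's own outputs to progressively straighten the trajectories.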
arXiv Detail & Related papers (2024-10-19T02:36:11Z)
- Strongly Isomorphic Neural Optimal Transport Across Incomparable Spaces [7.535219325248997]
We present a novel neural formulation of the Gromov-Monge problem rooted in one of its fundamental properties.
We operationalize this property by decomposing the learnable OT map into two components.
Our framework provides a promising approach to learn OT maps across diverse spaces.
arXiv Detail & Related papers (2024-07-20T18:27:11Z)
- Amortizing intractable inference in diffusion models for vision, language, and control [89.65631572949702]
This paper studies amortized sampling of the posterior over data, $\mathbf{x} \sim p^{\mathrm{post}}(\mathbf{x}) \propto p(\mathbf{x})\, r(\mathbf{x})$, in a model that consists of a diffusion generative model prior $p(\mathbf{x})$ and a black-box constraint or function $r(\mathbf{x})$.
We prove the correctness of a data-free learning objective, relative trajectory balance, for training a diffusion model that samples from this posterior.
arXiv Detail & Related papers (2024-05-31T16:18:46Z)
- Mixed Gaussian Flow for Diverse Trajectory Prediction [78.00204650749453]
We propose a flow-based model to transform a mixed Gaussian prior into the future trajectory manifold.
The model shows a better capacity for generating diverse trajectory patterns.
We also demonstrate that it can generate diverse, controllable, and out-of-distribution trajectories.
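A minimal sketch of the prior side of such a model (hypothetical names; the actual architecture is more involved): a mixture-of-Gaussians base whose components can specialize to distinct trajectory modes before being pushed through an invertible flow.

    import torch

    # Mixture-of-Gaussians prior for a flow: K components, learnable means.
    K, d = 5, 2
    means = torch.nn.Parameter(torch.randn(K, d))

    def sample_prior(n, scale=0.3):
        comp = torch.randint(K, (n,))       # pick a mixture component
        return means[comp] + scale * torch.randn(n, d)

    # z = sample_prior(64) would then be decoded by an invertible flow
    # into a future trajectory; distinct components yield distinct modes.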
arXiv Detail & Related papers (2024-02-19T15:48:55Z)
- Generative Modeling through the Semi-dual Formulation of Unbalanced Optimal Transport [9.980822222343921]
We propose a novel generative model based on the semi-dual formulation of Unbalanced Optimal Transport (UOT).
Unlike OT, UOT relaxes the hard constraint on distribution matching. This approach provides better robustness against outliers, stability during training, and faster convergence.
Our model outperforms existing OT-based generative models, achieving FID scores of 2.97 on CIFAR-10 and 6.36 on CelebA-HQ-256.
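For context, UOT replaces OT's hard marginal constraints with soft divergence penalties; a common primal form (notation ours) reads
$$ \mathrm{UOT}(\mu, \nu) = \inf_{\pi \ge 0} \int c \, d\pi + \lambda_1 D_f(\pi_0 \,\|\, \mu) + \lambda_2 D_f(\pi_1 \,\|\, \nu), $$
where $\pi_0, \pi_1$ are the marginals of $\pi$; the generative model above is trained through the semi-dual of this objective.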
arXiv Detail & Related papers (2023-05-24T06:31:05Z)
- Normalizing flow sampling with Langevin dynamics in the latent space [12.91637880428221]
Normalizing flows (NF) use a continuous generator to map a simple latent (e.g. Gaussian) distribution towards an empirical target distribution associated with a training data set.
Since standard NF implement differentiable maps, they may suffer from pathological behaviors when targeting complex distributions.
This paper proposes a new Markov chain Monte Carlo algorithm to sample from the target distribution in the latent domain before transporting it back to the target domain.
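A hedged sketch of the latent-space sampler (the paper's algorithm may differ in details such as a Metropolis correction): unadjusted Langevin dynamics targeting a latent density whose pushforward through the flow is the desired target.

    import torch

    # Unadjusted Langevin dynamics in the latent space of a flow.
    # log_pi is the (unnormalized) latent log-density; a toy Gaussian
    # stands in here for the pullback of the true target through the flow.
    def log_pi(z):
        return -(z ** 2).sum(dim=1) / 2

    z, eps = torch.randn(256, 2), 1e-2
    for _ in range(1000):
        z = z.detach().requires_grad_(True)
        (g,) = torch.autograd.grad(log_pi(z).sum(), z)
        z = (z + eps * g + (2 * eps) ** 0.5 * torch.randn_like(z)).detach()
    # Data-space samples are then x = f(z) for the trained flow f.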
arXiv Detail & Related papers (2023-05-20T09:31:35Z)
- The Schrödinger Bridge between Gaussian Measures has a Closed Form [101.79851806388699]
We focus on the dynamic formulation of entropy-regularized OT, also known as the Schrödinger bridge (SB) problem.
In this paper, we provide closed-form expressions for SBs between Gaussian measures.
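For orientation, the classical (zero-noise) limit of these expressions is the well-known OT Monge map between Gaussians: for $\mu = \mathcal{N}(m_\mu, \Sigma_\mu)$ and $\nu = \mathcal{N}(m_\nu, \Sigma_\nu)$,
$$ T(x) = m_\nu + \Sigma_\mu^{-1/2} \big( \Sigma_\mu^{1/2} \Sigma_\nu \Sigma_\mu^{1/2} \big)^{1/2} \Sigma_\mu^{-1/2} (x - m_\mu), $$
an affine map that the SB formulas generalize to positive entropic regularization.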
arXiv Detail & Related papers (2022-02-11T15:59:01Z)
- Resampling Base Distributions of Normalizing Flows [0.0]
We introduce a base distribution for normalizing flows based on learned rejection sampling.
We develop suitable learning algorithms based on both maximizing the log-likelihood and minimizing the reverse Kullback-Leibler divergence.
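A minimal sketch of the sampling side (hypothetical network; the learning objectives are as described above): an acceptance network $a(z) \in (0, 1)$ thins a Gaussian proposal, giving a base density proportional to $\mathcal{N}(z; 0, I)\, a(z)$.

    import torch

    # Acceptance-probability network for learned rejection sampling.
    a = torch.nn.Sequential(
        torch.nn.Linear(2, 64), torch.nn.Tanh(),
        torch.nn.Linear(64, 1), torch.nn.Sigmoid(),
    )

    def sample_base(n):
        out = []
        while sum(x.shape[0] for x in out) < n:
            z = torch.randn(4 * n, 2)              # Gaussian proposals
            keep = torch.rand(4 * n, 1) < a(z)     # learned thinning
            out.append(z[keep.squeeze(-1)])
        return torch.cat(out)[:n]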
arXiv Detail & Related papers (2021-10-29T14:44:44Z)
- Gaussianization Flows [113.79542218282282]
We propose a new type of normalizing flow model that enables both efficient computation of likelihoods and efficient inversion for sample generation.
Thanks to a guaranteed expressivity, these flows can capture multimodal target distributions without compromising the efficiency of sample generation.
arXiv Detail & Related papers (2020-03-04T08:15:06Z)
- Learning Implicit Generative Models with Theoretical Guarantees [12.761710596142109]
We propose a unified framework for implicit generative modeling (UnifiGem).
UnifiGem integrates approaches from optimal transport, numerical ODE, density-ratio (density-difference) estimation and deep neural networks.
Experimental results on both synthetic datasets and real benchmark datasets support our theoretical findings and demonstrate the effectiveness of UnifiGem.
arXiv Detail & Related papers (2020-02-07T15:55:48Z)
- A Near-Optimal Gradient Flow for Learning Neural Energy-Based Models [93.24030378630175]
We propose a novel numerical scheme to optimize the gradient flows for learning energy-based models (EBMs).
We derive a second-order Wasserstein gradient flow of the global relative entropy from the Fokker-Planck equation.
Compared with existing schemes, the Wasserstein gradient flow is a smoother, near-optimal numerical scheme for approximating real data densities.
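The identity behind this (standard since Jordan, Kinderlehrer, and Otto) is that the Fokker-Planck equation is the Wasserstein-2 gradient flow of the relative entropy: for a target density $\pi$,
$$ \partial_t \rho_t = \nabla \cdot \Big( \rho_t \, \nabla \log \frac{\rho_t}{\pi} \Big), $$
i.e. the $W_2$ steepest descent of $\mathrm{KL}(\rho \,\|\, \pi)$, the flow that the second-order scheme above builds on.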
arXiv Detail & Related papers (2019-10-31T02:26:20Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.