OT-Flow: Fast and Accurate Continuous Normalizing Flows via Optimal
Transport
- URL: http://arxiv.org/abs/2006.00104v5
- Date: Tue, 23 Mar 2021 20:30:00 GMT
- Title: OT-Flow: Fast and Accurate Continuous Normalizing Flows via Optimal
Transport
- Authors: Derek Onken, Samy Wu Fung, Xingjian Li, Lars Ruthotto
- Abstract summary: A normalizing flow is an invertible mapping between an arbitrary probability distribution and a standard normal distribution.
OT-Flow tackles two critical computational challenges that limit the more widespread use of CNFs.
On five high-dimensional density estimation and generative modeling tasks, OT-Flow performs competitively with state-of-the-art CNFs.
- Score: 8.468007443062751
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A normalizing flow is an invertible mapping between an arbitrary probability
distribution and a standard normal distribution; it can be used for density
estimation and statistical inference. Computing the flow follows the change of
variables formula and thus requires invertibility of the mapping and an
efficient way to compute the determinant of its Jacobian. To satisfy these
requirements, normalizing flows typically consist of carefully chosen
components. Continuous normalizing flows (CNFs) are mappings obtained by
solving a neural ordinary differential equation (ODE). The neural ODE's
dynamics can be chosen almost arbitrarily while ensuring invertibility.
Moreover, the log-determinant of the flow's Jacobian can be obtained by
integrating the trace of the dynamics' Jacobian along the flow. Our proposed
OT-Flow approach tackles two critical computational challenges that limit the
more widespread use of CNFs. First, OT-Flow leverages optimal transport (OT)
theory to regularize the CNF and enforce straight trajectories that are easier
to integrate. Second, OT-Flow features exact trace computation with time
complexity equal to trace estimators used in existing CNFs. On five
high-dimensional density estimation and generative modeling tasks, OT-Flow
performs competitively with state-of-the-art CNFs while on average requiring
one-fourth the number of weights, an 8x speedup in training time, and a 24x
speedup in inference.
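For reference, the change-of-variables relation and the OT-regularized objective that the abstract alludes to can be written in standard CNF notation as follows (the paper's exact formulation, penalty weighting, and architecture may differ):

```latex
\begin{aligned}
&\frac{dz(t)}{dt} = v_\theta\big(z(t), t\big), \qquad z(0) = x, \\
&\log p_X(x) = \log p_Z\big(z(T)\big) + \int_0^T \operatorname{tr}\!\Big(\nabla_z v_\theta\big(z(t), t\big)\Big)\, dt, \\
&\min_\theta\; \mathbb{E}_{x \sim p_X}\Big[-\log p_X(x) + \alpha \int_0^T \tfrac{1}{2}\big\|v_\theta\big(z(t), t\big)\big\|^2\, dt\Big],
\end{aligned}
```

where the last term is the transport-cost penalty that encourages straight, easy-to-integrate trajectories. Below is a minimal numerical sketch of these quantities, assuming a toy dynamics network and forward-Euler integration; the names and architecture are illustrative and not taken from the OT-Flow code. The generic per-dimension autograd loop shown here costs d extra backward passes, whereas OT-Flow derives the exact trace in closed form for its specific architecture at a cost comparable to stochastic trace estimation.

```python
import math
import torch

d = 2  # toy data dimension
# Hypothetical dynamics network v(x, t); an illustrative stand-in, not OT-Flow's architecture.
net = torch.nn.Sequential(
    torch.nn.Linear(d + 1, 64), torch.nn.Tanh(), torch.nn.Linear(64, d)
)

def velocity(x, t):
    """Velocity field v(x, t); t is appended as an extra input column."""
    tcol = torch.full((x.shape[0], 1), t)
    return net(torch.cat([x, tcol], dim=1))

def flow(x0, n_steps=20, T=1.0):
    """Forward-Euler integration of the state x, the log-determinant
    (integral of tr(dv/dx)), and the OT transport cost (integral of 0.5*|v|^2)."""
    h = T / n_steps
    x = x0.clone().requires_grad_(True)
    logdet = torch.zeros(x0.shape[0])
    cost = torch.zeros(x0.shape[0])
    for k in range(n_steps):
        v = velocity(x, k * h)
        # Exact trace of the Jacobian dv/dx, one backward pass per dimension.
        tr = torch.zeros(x0.shape[0])
        for i in range(d):
            gi = torch.autograd.grad(v[:, i].sum(), x, create_graph=True)[0]
            tr = tr + gi[:, i]
        logdet = logdet + h * tr
        cost = cost + h * 0.5 * (v ** 2).sum(dim=1)
        x = x + h * v
    return x, logdet, cost

# Negative log-likelihood under a standard normal target, plus the OT penalty.
x0 = torch.randn(8, d)
z, logdet, cost = flow(x0)
nll = 0.5 * (z ** 2).sum(dim=1) + 0.5 * d * math.log(2 * math.pi) - logdet
loss = (nll + 1.0 * cost).mean()  # the OT penalty weight is a free hyperparameter
loss.backward()
```

Existing CNFs such as FFJORD avoid the per-dimension loop with a Hutchinson estimator, $\operatorname{tr}(A) \approx \mathbb{E}_\varepsilon[\varepsilon^\top A \varepsilon]$; the abstract's claim is that, for OT-Flow's architecture, the exact trace is available at comparable time complexity.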
Related papers
- Entropy-Informed Weighting Channel Normalizing Flow [7.751853409569806]
We propose a regularized and feature-dependent $\mathtt{Shuffle}$ operation and integrate it into the vanilla multi-scale architecture.
We observe that this operation guides the variables to evolve in the direction of entropy increase, hence we refer to NFs with the $\mathtt{Shuffle}$ operation as the \emph{Entropy-Informed Weighting Channel Normalizing Flow} (EIW-Flow).
arXiv Detail & Related papers (2024-07-06T04:46:41Z)
- Consistency Flow Matching: Defining Straight Flows with Velocity Consistency [97.28511135503176]
We introduce Consistency Flow Matching (Consistency-FM), a novel FM method that explicitly enforces self-consistency in the velocity field.
Preliminary experiments demonstrate that our Consistency-FM significantly improves training efficiency by converging 4.4x faster than consistency models.
arXiv Detail & Related papers (2024-07-02T16:15:37Z)
- Taming Hyperparameter Tuning in Continuous Normalizing Flows Using the JKO Scheme [60.79981399724534]
A normalizing flow (NF) is a mapping that transforms a chosen probability distribution into a normal distribution.
We present JKO-Flow, an algorithm that solves OT-based CNFs without the need to tune $\alpha$.
arXiv Detail & Related papers (2022-11-30T05:53:21Z)
- Flow Matching for Generative Modeling [44.66897082688762]
Flow Matching is a simulation-free approach for training Continuous Normalizing Flows (CNFs)
We find that employing FM with diffusion paths results in a more robust and stable alternative for training diffusion models.
Training CNFs using Flow Matching on ImageNet leads to state-of-the-art performance in terms of both likelihood and sample quality.
arXiv Detail & Related papers (2022-10-06T08:32:20Z)
- Manifold Interpolating Optimal-Transport Flows for Trajectory Inference [64.94020639760026]
We present a method called Manifold Interpolating Optimal-Transport Flow (MIOFlow)
MIOFlow learns continuous population dynamics from static snapshot samples taken at sporadic timepoints.
We evaluate our method on simulated data with bifurcations and merges, as well as scRNA-seq data from embryoid body differentiation, and acute myeloid leukemia treatment.
arXiv Detail & Related papers (2022-06-29T22:19:03Z)
- GMFlow: Learning Optical Flow via Global Matching [124.57850500778277]
We propose GMFlow, a framework for learning optical flow estimation.
It consists of three main components: a customized Transformer for feature enhancement, a correlation and softmax layer for global feature matching, and a self-attention layer for flow propagation.
Our new framework outperforms 32-iteration RAFT on the challenging Sintel benchmark.
arXiv Detail & Related papers (2021-11-26T18:59:56Z)
- Convex Potential Flows: Universal Probability Distributions with Optimal Transport and Convex Optimization [8.683116789109462]
This paper introduces Convex Potential Flows (CP-Flow), a natural and efficient parameterization of invertible models.
CP-Flows are the gradient map of a strongly convex neural potential function; a minimal sketch of this construction appears after the list below.
We show that CP-Flow performs competitively on standard benchmarks of density estimation and variational inference.
arXiv Detail & Related papers (2020-12-10T19:36:34Z)
- Self Normalizing Flows [65.73510214694987]
We propose a flexible framework for training normalizing flows by replacing expensive terms in the gradient by learned approximate inverses at each layer.
This reduces the computational complexity of each layer's exact update from $\mathcal{O}(D^3)$ to $\mathcal{O}(D^2)$.
We show experimentally that such models are remarkably stable and optimize to similar data likelihood values as their exact gradient counterparts.
arXiv Detail & Related papers (2020-11-14T09:51:51Z)
- Learning Likelihoods with Conditional Normalizing Flows [54.60456010771409]
Conditional normalizing flows (CNFs) are efficient in sampling and inference.
We present a study of CNFs in which the base density to output space mapping is conditioned on an input x to model conditional densities p(y|x).
arXiv Detail & Related papers (2019-11-29T19:17:58Z)
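As referenced in the Convex Potential Flows entry above, here is a minimal sketch of the gradient-map construction, assuming a standard input-convex neural network (ICNN) parameterization; the class and variable names are illustrative and not taken from the CP-Flow codebase:

```python
import torch
import torch.nn.functional as F

class ICNN(torch.nn.Module):
    """A small input-convex potential phi(x): convex, nondecreasing
    activations plus nonnegative hidden-to-hidden weights keep phi convex
    in x, and the added quadratic term makes it strongly convex."""
    def __init__(self, d, width=64, alpha=0.1):
        super().__init__()
        self.A0 = torch.nn.Linear(d, width)   # affine in x
        self.A1 = torch.nn.Linear(d, width)   # skip connection from x
        self.W1 = torch.nn.Parameter(0.1 * torch.rand(width, width))
        self.A2 = torch.nn.Linear(d, 1)
        self.w2 = torch.nn.Parameter(0.1 * torch.rand(1, width))
        self.alpha = alpha

    def forward(self, x):
        z = F.softplus(self.A0(x))                           # convex in x
        z = F.softplus(z @ F.relu(self.W1).T + self.A1(x))   # relu keeps hidden weights >= 0
        out = z @ F.relu(self.w2).T + self.A2(x)
        return out.squeeze(-1) + 0.5 * self.alpha * (x ** 2).sum(dim=1)

# The flow map is the gradient of the potential, f(x) = grad phi(x).
# Strong convexity of phi makes f invertible, and the Jacobian of f is the
# Hessian of phi, a symmetric positive definite matrix, so log|det df/dx|
# is the log-determinant of an SPD matrix.
d = 2
phi = ICNN(d)
x = torch.randn(5, d, requires_grad=True)
f = torch.autograd.grad(phi(x).sum(), x, create_graph=True)[0]  # shape (5, 2)
```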