Flow Matching for Generative Modeling
- URL: http://arxiv.org/abs/2210.02747v1
- Date: Thu, 6 Oct 2022 08:32:20 GMT
- Title: Flow Matching for Generative Modeling
- Authors: Yaron Lipman, Ricky T. Q. Chen, Heli Ben-Hamu, Maximilian Nickel, Matt Le
- Abstract summary: Flow Matching is a simulation-free approach for training Continuous Normalizing Flows (CNFs).
We find that employing FM with diffusion paths results in a more robust and stable alternative for training diffusion models.
Training CNFs using Flow Matching on ImageNet leads to state-of-the-art performance in terms of both likelihood and sample quality.
- Score: 44.66897082688762
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We introduce a new paradigm for generative modeling built on Continuous
Normalizing Flows (CNFs), allowing us to train CNFs at unprecedented scale.
Specifically, we present the notion of Flow Matching (FM), a simulation-free
approach for training CNFs based on regressing vector fields of fixed
conditional probability paths. Flow Matching is compatible with a general
family of Gaussian probability paths for transforming between noise and data
samples -- which subsumes existing diffusion paths as specific instances.
Interestingly, we find that employing FM with diffusion paths results in a more
robust and stable alternative for training diffusion models. Furthermore, Flow
Matching opens the door to training CNFs with other, non-diffusion probability
paths. An instance of particular interest is using Optimal Transport (OT)
displacement interpolation to define the conditional probability paths. These
paths are more efficient than diffusion paths, provide faster training and
sampling, and result in better generalization. Training CNFs using Flow
Matching on ImageNet leads to state-of-the-art performance in terms of both
likelihood and sample quality, and allows fast and reliable sample generation
using off-the-shelf numerical ODE solvers.
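For concreteness, here is a minimal PyTorch sketch of the conditional FM objective with the OT displacement path, together with fixed-step Euler sampling as a stand-in for an off-the-shelf ODE solver. The model v_theta, sigma_min, and all function names are illustrative assumptions, not the paper's reference implementation.

```python
# Minimal sketch (assumptions noted in the lead-in): conditional Flow Matching
# with the OT displacement path x_t = (1 - (1 - sigma_min) t) x0 + t x1, whose
# conditional target velocity is u_t = x1 - (1 - sigma_min) x0.
import torch

def cfm_loss(v_theta, x1, sigma_min=1e-4):
    """One CFM training term: regress v_theta onto the conditional velocity."""
    x0 = torch.randn_like(x1)                         # noise endpoint
    # One time per sample, shaped to broadcast over the remaining dims.
    t = torch.rand(x1.shape[0], *([1] * (x1.dim() - 1)), device=x1.device)
    xt = (1 - (1 - sigma_min) * t) * x0 + t * x1      # point on the path
    target = x1 - (1 - sigma_min) * x0                # conditional vector field
    return ((v_theta(xt, t.flatten()) - target) ** 2).mean()

@torch.no_grad()
def sample(v_theta, shape, steps=100, device="cpu"):
    """Integrate dx/dt = v_theta(x, t) from t=0 (noise) to t=1 (data)
    with fixed-step Euler, a stand-in for an adaptive ODE solver."""
    x = torch.randn(shape, device=device)
    dt = 1.0 / steps
    for i in range(steps):
        t = torch.full((shape[0],), i * dt, device=device)
        x = x + dt * v_theta(x, t)
    return x
```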
Related papers
- Local Flow Matching Generative Models [19.859984725284896]
Flow Matching (FM) is a simulation-free method for learning a continuous and invertible flow to interpolate between two distributions.
We introduce Local Flow Matching (LFM), which learns a sequence of FM sub-models, each matching a diffusion process up to a step-size time increment in the data-to-noise direction.
In experiments, we demonstrate the improved training efficiency and competitive generative performance of LFM compared to FM.
arXiv Detail & Related papers (2024-10-03T14:53:10Z)
- Stream-level flow matching from a Bayesian decision theoretic perspective [4.935875591615496]
Flow matching (FM) is a family of training algorithms for fitting continuous normalizing flows (CNFs).
We show that viewing CFM training from a Bayesian decision theoretic perspective on parameter estimation opens the door to generalizations of CFM algorithms.
arXiv Detail & Related papers (2024-09-30T15:47:22Z)
- Consistency Flow Matching: Defining Straight Flows with Velocity Consistency [97.28511135503176]
We introduce Consistency Flow Matching (Consistency-FM), a novel FM method that explicitly enforces self-consistency in the velocity field.
Preliminary experiments demonstrate that our Consistency-FM significantly improves training efficiency by converging 4.4x faster than consistency models.
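As a rough, hypothetical illustration of a velocity self-consistency penalty in this spirit (the paper's exact objective may differ), one can require that straight-line endpoint predictions agree along a trajectory:

```python
# Hypothetical sketch of a velocity self-consistency penalty: velocities at
# nearby times on the same trajectory should imply the same endpoint via
# f(x, t) = x + (1 - t) v(x, t). Names, weighting, and the stop-gradient
# choice are assumptions, not Consistency-FM's exact objective.
import torch

def velocity_consistency_loss(v_theta, xt, t, delta=0.01):
    # t: shape (batch, 1) so it broadcasts over feature dimensions.
    f_t = xt + (1 - t) * v_theta(xt, t)          # endpoint implied at time t
    with torch.no_grad():                        # stop-gradient target
        xs = xt + delta * v_theta(xt, t)         # Euler step along the flow
        s = t + delta
        f_s = xs + (1 - s) * v_theta(xs, s)      # endpoint implied at time s
    return ((f_t - f_s) ** 2).mean()
```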
arXiv Detail & Related papers (2024-07-02T16:15:37Z)
- Markovian Flow Matching: Accelerating MCMC with Continuous Normalizing Flows [2.2530496464901106]
Continuous normalizing flows (CNFs) learn the probability path between a reference distribution and a target distribution by modeling the vector field generating said path using neural networks.
Recently, Lipman et al. (2022) introduced a simple and inexpensive method for training CNFs in generative modeling, termed flow matching (FM).
In this paper, we repurpose this method for probabilistic inference by incorporating Markovian sampling methods in evaluating the FM objective, and using the learned CNF to improve Monte Carlo sampling.
arXiv Detail & Related papers (2024-05-23T10:08:19Z)
- Guided Flows for Generative Modeling and Decision Making [55.42634941614435]
We show that Guided Flows significantly improves the sample quality in conditional image generation and zero-shot text-to-speech synthesis.
Notably, we are the first to apply flow models for plan generation in the offline reinforcement learning setting, with a speedup in computation compared to diffusion models.
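The guidance mechanism itself can be sketched directly on the vector field, classifier-free-guidance style; the model signature, null_cond, and guidance_scale below are illustrative assumptions:

```python
# Minimal sketch of classifier-free guidance applied to a flow's velocity
# field: blend unconditional and conditional predictions at sampling time.
import torch

@torch.no_grad()
def guided_velocity(v_theta, x, t, cond, null_cond, guidance_scale=2.0):
    v_uncond = v_theta(x, t, null_cond)   # velocity with the "empty" condition
    v_cond = v_theta(x, t, cond)          # velocity with the target condition
    return v_uncond + guidance_scale * (v_cond - v_uncond)
```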
arXiv Detail & Related papers (2023-11-22T15:07:59Z)
- Improving and generalizing flow-based generative models with minibatch optimal transport [90.01613198337833]
We introduce the generalized conditional flow matching (CFM) technique for continuous normalizing flows (CNFs).
CFM features a stable regression objective like that used to train the flow in diffusion models but enjoys the efficient inference of deterministic flow models.
A variant of our objective is optimal transport CFM (OT-CFM), which creates simpler flows that are more stable to train and lead to faster inference.
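A minimal sketch of the minibatch coupling idea, assuming scipy's exact assignment solver as a stand-in for a dedicated OT solver (e.g., the POT library), with all names illustrative:

```python
# Re-pair a noise batch with a data batch so that, within the minibatch,
# total squared transport cost is minimized; the matched pairs then feed
# the usual CFM regression loss instead of independent (x0, x1) pairs.
import torch
from scipy.optimize import linear_sum_assignment

def ot_pair(x0, x1):
    cost = torch.cdist(x0.flatten(1), x1.flatten(1)) ** 2   # (B, B) pair costs
    rows, cols = linear_sum_assignment(cost.cpu().numpy())  # exact assignment
    return x0[torch.as_tensor(rows)], x1[torch.as_tensor(cols)]
```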
arXiv Detail & Related papers (2023-02-01T14:47:17Z)
- Matching Normalizing Flows and Probability Paths on Manifolds [57.95251557443005]
Continuous Normalizing Flows (CNFs) are generative models that transform a prior distribution to a model distribution by solving an ordinary differential equation (ODE).
We propose to train CNFs by minimizing probability path divergence (PPD), a novel family of divergences between the probability density path generated by the CNF and a target probability density path.
We show that CNFs learned by minimizing PPD achieve state-of-the-art results in likelihoods and sample quality on existing low-dimensional manifold benchmarks.
arXiv Detail & Related papers (2022-07-11T08:50:19Z)
- Learning Likelihoods with Conditional Normalizing Flows [54.60456010771409]
Conditional normalizing flows (CNFs) are efficient in sampling and inference.
We present a study of CNFs in which the mapping from base density to output space is conditioned on an input x, to model conditional densities p(y|x).
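A minimal sketch of the idea, assuming a single affine transform whose parameters come from x (the paper's architecture is richer); all class and layer choices are illustrative:

```python
# Conditional normalizing flow with one affine layer: the scale and shift
# applied to y are functions of the conditioning input x, which yields a
# tractable conditional density p(y|x) by the change-of-variables formula.
import torch
import torch.nn as nn

class ConditionalAffineFlow(nn.Module):
    def __init__(self, x_dim, y_dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(x_dim, hidden), nn.ReLU(), nn.Linear(hidden, 2 * y_dim)
        )

    def log_prob(self, y, x):
        """log p(y|x) for y = z * exp(log_s(x)) + shift(x), z ~ N(0, I)."""
        log_s, shift = self.net(x).chunk(2, dim=-1)
        z = (y - shift) * torch.exp(-log_s)
        base = torch.distributions.Normal(0.0, 1.0)
        return (base.log_prob(z) - log_s).sum(-1)   # includes log|det dz/dy|

    def sample(self, x):
        """Draw y ~ p(y|x) by pushing base noise through the forward map."""
        log_s, shift = self.net(x).chunk(2, dim=-1)
        z = torch.randn_like(log_s)
        return z * torch.exp(log_s) + shift
```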
arXiv Detail & Related papers (2019-11-29T19:17:58Z)