Switched Flow Matching: Eliminating Singularities via Switching ODEs
- URL: http://arxiv.org/abs/2405.11605v2
- Date: Thu, 23 May 2024 13:42:02 GMT
- Title: Switched Flow Matching: Eliminating Singularities via Switching ODEs
- Authors: Qunxi Zhu, Wei Lin
- Abstract summary: Continuous-time generative models, such as Flow Matching (FM), construct probability paths to transport between one distribution and another.
During inference, however, the learned model often requires multiple neural network evaluations to accurately integrate the flow.
- We propose Switched FM (SFM), which eliminates singularities via switching ODEs, as opposed to the uniform ODE used in FM.
- Score: 12.273757042838675
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Continuous-time generative models, such as Flow Matching (FM), construct probability paths that transport one distribution to another through simulation-free learning of neural ordinary differential equations (ODEs). During inference, however, the learned model often requires multiple neural network evaluations to accurately integrate the flow, resulting in slow sampling. We attribute this to the inherent (joint) heterogeneity of the source and/or target distributions, namely the singularity problem, which poses challenges for training the neural ODEs effectively. To address this issue, we propose a more general framework, termed Switched FM (SFM), which eliminates singularities via switching ODEs, as opposed to the uniform ODE used in FM. Importantly, we theoretically show that FM cannot transport between two simple distributions, owing to the existence and uniqueness of solutions to ODE initial value problems, whereas these limitations are well handled by SFM. From an orthogonal perspective, our framework can seamlessly integrate with existing advanced techniques, such as minibatch optimal transport, to further enhance the straightness of the flow, yielding a more efficient sampling process with reduced costs. We demonstrate the effectiveness of the newly proposed SFM through several numerical examples.
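To make the mechanism concrete, here is a minimal PyTorch-style sketch of the switched idea as we read it from the abstract: the vector field is conditioned on a discrete switch s that selects which ODE a sample follows, and training is the usual conditional FM regression. All names (SwitchedField, n_modes, sfm_loss) are our illustrative assumptions, not the authors' code.

```python
# Minimal sketch of a switched vector field for SFM (illustrative only).
# Assumption: the "switch" is a discrete mode label s per sample; the
# names SwitchedField / n_modes / sfm_loss are hypothetical.
import torch
import torch.nn as nn

class SwitchedField(nn.Module):
    """Velocity field v(x, t, s) conditioned on a discrete switch s."""
    def __init__(self, dim: int, n_modes: int, width: int = 128):
        super().__init__()
        self.embed = nn.Embedding(n_modes, width)
        self.net = nn.Sequential(
            nn.Linear(dim + 1 + width, width), nn.SiLU(),
            nn.Linear(width, width), nn.SiLU(),
            nn.Linear(width, dim),
        )

    def forward(self, x, t, s):
        # x: (B, dim), t: (B, 1), s: (B,) integer mode labels
        return self.net(torch.cat([x, t, self.embed(s)], dim=-1))

def sfm_loss(model, x0, x1, s):
    """Conditional FM regression along the straight path, per switch mode."""
    t = torch.rand(x0.size(0), 1)
    xt = (1 - t) * x0 + t * x1      # linear interpolant between the pair
    target = x1 - x0                # its constant velocity
    return ((model(xt, t, s) - target) ** 2).mean()
```

At sampling time each particle integrates the ODE under its own fixed switch s, so the state is effectively augmented from x to (x, s); trajectories belonging to different modes can then cross in x-space without contradicting the existence-uniqueness theorem that, per the abstract, blocks a single uniform ODE.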
Related papers
- DFM: Interpolant-free Dual Flow Matching [0.8192907805418583]
We propose an interpolant-free dual flow matching (DFM) approach without explicit assumptions about the modeled vector field.
Experiments on the SMAP unsupervised anomaly detection task show advantages of DFM compared to CNFs trained with either maximum likelihood or FM objectives.
arXiv Detail & Related papers (2024-10-11T20:46:04Z)
- Local Flow Matching Generative Models [19.859984725284896]
Flow Matching (FM) is a simulation-free method for learning a continuous and invertible flow to interpolate between two distributions.
We introduce Local Flow Matching (LFM), which learns a sequence of FM sub-models, each matching a diffusion process over one step-size interval in the data-to-noise direction.
In experiments, we demonstrate the improved training efficiency and competitive generative performance of LFM compared to FM.
arXiv Detail & Related papers (2024-10-03T14:53:10Z)
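Below is a toy sketch of the stepwise construction in this entry, under our reading: split the data-to-noise path into sub-intervals and fit one FM sub-model per interval between consecutive noisy marginals. The interpolation and names (noisy_marginal, lfm_step_loss) are assumptions; see the paper for the exact diffusion schedule.

```python
# Toy sketch of Local Flow Matching (illustrative, not the authors' code).
# Assumption: a variance-preserving-style interpolation toward N(0, I).
import torch

def noisy_marginal(x, alpha):
    """Blend data toward Gaussian noise: alpha=1 is data, alpha=0 is noise."""
    return alpha * x + (1 - alpha ** 2) ** 0.5 * torch.randn_like(x)

def lfm_step_loss(submodel, x, alpha_lo, alpha_hi):
    """FM loss for one sub-interval [alpha_lo, alpha_hi] of the schedule."""
    x_lo = noisy_marginal(x, alpha_lo)   # noisier endpoint
    x_hi = noisy_marginal(x, alpha_hi)   # less noisy endpoint
    t = torch.rand(x.size(0), 1)
    xt = (1 - t) * x_lo + t * x_hi       # local straight path
    return ((submodel(xt, t) - (x_hi - x_lo)) ** 2).mean()
```

Each sub-model only has to bridge nearby distributions, which is the intuition behind the reported training-efficiency gains.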
- FFHFlow: A Flow-based Variational Approach for Multi-fingered Grasp Synthesis in Real Time [19.308304984645684]
We propose exploiting a special kind of Deep Generative Model (DGM) based on Normalizing Flows (NFs).
We first observed an encouraging improvement in diversity by directly applying a single conditional NF (cNF) to learn a grasp distribution conditioned on the incomplete point cloud.
This motivated us to develop a novel flow-based Deep Latent Variable Model (DLVM).
Unlike Variational Autoencoders (VAEs), the proposed DLVM counteracts typical pitfalls by leveraging two cNFs for the prior and likelihood distributions.
arXiv Detail & Related papers (2024-07-21T13:33:08Z)
- Dynamical Measure Transport and Neural PDE Solvers for Sampling [77.38204731939273]
We cast the task of sampling from a probability density as transporting a tractable density to the target.
We employ physics-informed neural networks (PINNs) to approximate the solutions of the corresponding partial differential equations (PDEs).
PINNs allow for simulation- and discretization-free optimization and can be trained very efficiently.
arXiv Detail & Related papers (2024-07-10T17:39:50Z)
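For the PINN approach in this entry, the PDE being enforced is typically the continuity (transport) equation; the sketch below computes its pointwise residual in log-density form, $\partial_t \log \rho + \nabla \cdot v + v \cdot \nabla \log \rho = 0$. The parameterization (separate log_rho and v networks) is our assumption, not necessarily the paper's.

```python
# Sketch of a PINN residual for the continuity equation governing
# dynamical measure transport (parameterization is our assumption).
import torch

def continuity_residual(log_rho, v, x, t):
    """Residual of d/dt log_rho + div(v) + v . grad(log_rho) at (x, t).

    log_rho, v: callables mapping (x, t) -> (B, 1) and (B, d) tensors.
    """
    x = x.requires_grad_(True)
    t = t.requires_grad_(True)
    lr = log_rho(x, t)                                   # (B, 1)
    dlr_dt = torch.autograd.grad(lr.sum(), t, create_graph=True)[0]
    dlr_dx = torch.autograd.grad(lr.sum(), x, create_graph=True)[0]
    vel = v(x, t)                                        # (B, d)
    div_v = torch.zeros_like(lr)
    for i in range(x.size(1)):                           # divergence, per dim
        div_v = div_v + torch.autograd.grad(
            vel[:, i].sum(), x, create_graph=True)[0][:, i:i + 1]
    return dlr_dt + div_v + (vel * dlr_dx).sum(dim=1, keepdim=True)
```

A PINN then minimizes the squared residual over sampled collocation points (x, t), plus boundary terms pinning the density at t = 0 and t = 1, which is what makes the optimization simulation- and discretization-free.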
- Flow matching achieves almost minimax optimal convergence [50.38891696297888]
Flow matching (FM) has gained significant attention as a simulation-free generative model.
This paper discusses the convergence properties of FM for large sample sizes under the $p$-Wasserstein distance.
We establish that FM can achieve an almost minimax optimal convergence rate for $1 \leq p \leq 2$, presenting the first theoretical evidence that FM can reach convergence rates comparable to those of diffusion models.
arXiv Detail & Related papers (2024-05-31T14:54:51Z)
- On the Trajectory Regularity of ODE-based Diffusion Sampling [79.17334230868693]
Diffusion-based generative models use differential equations to establish a smooth connection between a complex data distribution and a tractable prior distribution.
In this paper, we identify several intriguing trajectory properties in the ODE-based sampling process of diffusion models.
arXiv Detail & Related papers (2024-05-18T15:59:41Z)
- Neural Flow Diffusion Models: Learnable Forward Process for Improved Diffusion Modelling [2.1779479916071067]
We introduce a novel framework that enhances diffusion models by supporting a broader range of forward processes.
We also propose a novel parameterization technique for learning the forward process.
Results underscore NFDM's versatility and its potential for a wide range of applications.
arXiv Detail & Related papers (2024-04-19T15:10:54Z)
- Consistency Trajectory Models: Learning Probability Flow ODE Trajectory of Diffusion [56.38386580040991]
The Consistency Trajectory Model (CTM) is a generalization of Consistency Models (CMs).
CTM enables the efficient combination of adversarial training and denoising score matching loss to enhance performance.
Unlike CM, CTM's access to the score function can streamline the adoption of established controllable/conditional generation methods.
arXiv Detail & Related papers (2023-10-01T05:07:17Z)
- Improving and generalizing flow-based generative models with minibatch optimal transport [90.01613198337833]
We introduce the generalized conditional flow matching (CFM) technique for continuous normalizing flows (CNFs).
CFM features a stable regression objective like that used to train the flow in diffusion models but enjoys the efficient inference of deterministic flow models.
A variant of our objective is optimal transport CFM (OT-CFM), which creates simpler flows that are more stable to train and lead to faster inference.
arXiv Detail & Related papers (2023-02-01T14:47:17Z)
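To illustrate the minibatch optimal transport idea in this entry, here is a toy version of the OT pairing that precedes the standard CFM regression. We use scipy's exact Hungarian assignment as a stand-in for a general OT solver; the function names (ot_pair, ot_cfm_loss) are ours, not the paper's implementation.

```python
# Toy minibatch-OT pairing for OT-CFM (illustrative, not the authors' code).
import torch
from scipy.optimize import linear_sum_assignment

def ot_pair(x0, x1):
    """Match each source sample to a target sample by solving the exact
    assignment problem under squared Euclidean cost within the minibatch."""
    cost = (torch.cdist(x0, x1, p=2) ** 2).detach().cpu().numpy()
    rows, cols = linear_sum_assignment(cost)
    return x0[rows], x1[cols]

def ot_cfm_loss(model, x0, x1):
    x0, x1 = ot_pair(x0, x1)             # OT coupling straightens the paths
    t = torch.rand(x0.size(0), 1)
    xt = (1 - t) * x0 + t * x1
    return ((model(xt, t) - (x1 - x0)) ** 2).mean()
```

Pairing samples by optimal transport straightens the conditional paths, which is precisely what reduces the number of integration steps needed at sampling time.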
- Influence Estimation and Maximization via Neural Mean-Field Dynamics [60.91291234832546]
We propose a novel learning framework using neural mean-field (NMF) dynamics for inference and estimation problems.
Our framework can simultaneously learn the structure of the diffusion network and the evolution of node infection probabilities.
arXiv Detail & Related papers (2021-06-03T00:02:05Z)