Dynamic Conditional Optimal Transport through Simulation-Free Flows
- URL: http://arxiv.org/abs/2404.04240v2
- Date: Fri, 31 May 2024 17:43:54 GMT
- Title: Dynamic Conditional Optimal Transport through Simulation-Free Flows
- Authors: Gavin Kerrigan, Giosue Migliorini, Padhraic Smyth
- Abstract summary: We study the geometry of conditional optimal transport (COT) and prove a dynamical formulation which generalizes the Benamou-Brenier Theorem.
We propose a simulation-free flow-based method for conditional generative modeling.
- Score: 12.976042923229466
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We study the geometry of conditional optimal transport (COT) and prove a dynamical formulation which generalizes the Benamou-Brenier Theorem. Equipped with these tools, we propose a simulation-free flow-based method for conditional generative modeling. Our method couples an arbitrary source distribution to a specified target distribution through a triangular COT plan, and a conditional generative model is obtained by approximating the geodesic path of measures induced by this COT plan. Our theory and methods are applicable in infinite-dimensional settings, making them well suited for a wide class of Bayesian inverse problems. Empirically, we demonstrate that our method is competitive on several challenging conditional generation tasks, including an infinite-dimensional inverse problem.
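As a rough, hedged illustration of the simulation-free training idea (our sketch, not the paper's algorithm), the following trains a conditional flow-matching vector field in a finite-dimensional Euclidean setting, where the geodesic between coupled samples is a straight line; the paper's triangular COT coupling is replaced here by a simple independent coupling, and all names and toy data are hypothetical.
```python
# Minimal sketch of conditional flow matching (assumptions: Euclidean case,
# independent coupling standing in for the paper's triangular COT plan).
import torch
import torch.nn as nn

class VectorField(nn.Module):
    """Time- and condition-dependent vector field v_theta(t, x, y)."""
    def __init__(self, x_dim, y_dim, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(x_dim + y_dim + 1, hidden), nn.SiLU(),
            nn.Linear(hidden, hidden), nn.SiLU(),
            nn.Linear(hidden, x_dim),
        )

    def forward(self, t, x, y):
        return self.net(torch.cat([x, y, t], dim=-1))

x_dim, y_dim = 2, 1
model = VectorField(x_dim, y_dim)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(1000):
    y = torch.rand(256, y_dim)            # hypothetical conditioning variable
    x1 = torch.randn(256, x_dim) + y      # toy conditional target samples
    x0 = torch.randn(256, x_dim)          # arbitrary source distribution
    t = torch.rand(256, 1)
    xt = (1 - t) * x0 + t * x1            # straight-line (geodesic) interpolant
    target = x1 - x0                      # velocity along the interpolant
    loss = ((model(t, xt, y) - target) ** 2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
```
At sampling time one integrates dx/dt = v_theta(t, x, y) from a source draw x0; the paper's coupling replaces the independent pairing above so that the learned flow approximates the geodesic path of measures induced by the triangular COT plan.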
Related papers
- Metric Flow Matching for Smooth Interpolations on the Data Manifold [40.24392451848883]
Metric Flow Matching (MFM) is a novel simulation-free framework for conditional flow matching.
We propose MFM as a framework for learning conditional paths that transform a source distribution into a target distribution.
We test MFM on a suite of challenges including LiDAR navigation, unpaired image translation, and modeling cellular dynamics.
arXiv Detail & Related papers (2024-05-23T16:48:06Z) - Constrained Synthesis with Projected Diffusion Models [47.56192362295252]
This paper introduces an approach that endows generative diffusion processes with the ability to satisfy and certify compliance with constraints and physical principles.
The proposed method recasts the traditional generative diffusion process as a constrained distribution problem to ensure adherence to constraints.
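As a hedged sketch of the projection idea (our illustration, not the paper's exact algorithm), one can interleave each reverse-diffusion update with a Euclidean projection onto the constraint set; `denoise_step` and the box constraint below are hypothetical placeholders.
```python
# Hedged sketch: alternate denoising updates with projection onto a
# constraint set C (a box constraint, chosen only for illustration).
import torch

def project_onto_box(x, lo=-1.0, hi=1.0):
    # Euclidean projection onto C = [lo, hi]^d.
    return x.clamp(lo, hi)

def sample_with_projection(denoise_step, n_steps, shape):
    # `denoise_step(x, t)` is a hypothetical reverse-diffusion update.
    x = torch.randn(shape)
    for t in reversed(range(n_steps)):
        x = denoise_step(x, t)      # unconstrained reverse step
        x = project_onto_box(x)     # re-impose the constraint at every step
    return x

# Toy usage with a trivial shrinkage map standing in for a learned model.
x = sample_with_projection(lambda x, t: 0.9 * x, n_steps=50, shape=(4, 2))
print((x.abs() <= 1.0).all())  # samples certified to lie in the box
```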
arXiv Detail & Related papers (2024-02-05T22:18:16Z) - Bayesian Conditional Diffusion Models for Versatile Spatiotemporal Turbulence Generation [13.278744447861289]
We introduce a novel generative framework grounded in probabilistic diffusion models for turbulence generation.
A notable feature of our approach is a method for long-span flow sequence generation based on autoregressive conditional sampling.
We showcase the versatile turbulence generation capability of our framework through a suite of numerical experiments.
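As a hedged sketch of autoregressive conditional sampling for long sequences (our stand-in, not the paper's code), each new window is generated conditioned on the tail of the previous one:
```python
# Hedged sketch: autoregressive long-sequence generation from a conditional
# sampler that produces fixed-length windows given a short context.
import torch

def generate_long_sequence(sample_window, n_windows, ctx_len, dim):
    # `sample_window(context)` is a hypothetical conditional generative model
    # returning a (window_len, dim) block given a (ctx_len, dim) context.
    seq = [sample_window(torch.zeros(ctx_len, dim))]  # cold start
    for _ in range(n_windows - 1):
        context = seq[-1][-ctx_len:]                  # condition on the tail
        seq.append(sample_window(context))
    return torch.cat(seq, dim=0)

# Toy usage: a stand-in sampler emitting noise around the context mean.
out = generate_long_sequence(
    lambda ctx: ctx.mean(dim=0, keepdim=True) + torch.randn(16, 3),
    n_windows=5, ctx_len=4, dim=3)
print(out.shape)  # torch.Size([80, 3])
```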
arXiv Detail & Related papers (2023-11-14T04:08:14Z) - Geometric Neural Diffusion Processes [55.891428654434634]
We extend the framework of diffusion models to incorporate a series of geometric priors in infinite-dimensional modelling.
We show that, under these conditions, the generative functional model admits the same symmetry.
arXiv Detail & Related papers (2023-07-11T16:51:38Z) - A generative flow for conditional sampling via optimal transport [1.0486135378491266]
This work proposes a non-parametric generative model that iteratively maps reference samples to the target.
The model uses block-triangular transport maps, whose components are shown to characterize conditionals of the target distribution.
These maps arise from solving an optimal transport problem with a weighted $L^2$ cost function, thereby extending the data-driven approach in [Trigila and Tabak, 2016] for conditional sampling.
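For orientation, the block-triangular structure referred to above can be written as follows (a standard construction stated from general knowledge, not copied from the paper): the first block fixes the conditioning variable y, and the second pushes a reference measure to the conditional of the target.
```latex
% Block-triangular (Knothe--Rosenblatt-style) transport map on pairs (y, x).
% The second block pushes the reference measure \eta to the conditional of
% the target \pi, so sampling x ~ \eta and evaluating \tau(y, x) yields a
% sample from \pi(. | y).
\[
  T(y, x) = \bigl( y, \; \tau(y, x) \bigr),
  \qquad
  \tau(y, \cdot)_{\#}\, \eta = \pi(\cdot \mid y).
\]
```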
arXiv Detail & Related papers (2023-07-09T05:36:26Z) - Flow Matching on General Geometries [43.252817099263744]
We propose a simple yet powerful framework for training continuous normalizing flows on manifold geometries.
We show that it is simulation-free on simple geometries, does not require divergence computation, and computes its target vector field in closed form.
Our method achieves state-of-the-art performance on many real-world non-Euclidean datasets.
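As a hedged illustration of a closed-form target vector field on a simple geometry (our sketch, using the standard geodesic conditional construction rather than the paper's code), the following computes the interpolant and its target velocity on the unit sphere via the exponential and logarithm maps:
```python
# Hedged sketch: closed-form geodesic interpolant and target velocity on
# the unit sphere S^2, using the Riemannian exp/log maps.
import numpy as np

def sphere_log(p, q, eps=1e-12):
    # Log map at p: tangent vector pointing toward q with length d(p, q).
    c = np.clip(np.dot(p, q), -1.0, 1.0)
    theta = np.arccos(c)
    v = q - c * p
    return theta * v / (np.linalg.norm(v) + eps)

def sphere_exp(p, v, eps=1e-12):
    # Exp map at p: follow the geodesic in direction v.
    norm = np.linalg.norm(v)
    return np.cos(norm) * p + np.sin(norm) * v / (norm + eps)

def geodesic_interpolant_and_velocity(x0, x1, t):
    # x_t = exp_{x0}(t * log_{x0}(x1)); target u_t = log_{x_t}(x1) / (1 - t),
    # the constant-speed velocity along the geodesic from x0 to x1.
    xt = sphere_exp(x0, t * sphere_log(x0, x1))
    ut = sphere_log(xt, x1) / (1.0 - t)
    return xt, ut

x0 = np.array([1.0, 0.0, 0.0])
x1 = np.array([0.0, 1.0, 0.0])
xt, ut = geodesic_interpolant_and_velocity(x0, x1, t=0.3)
print(xt, ut)  # point on the geodesic and the closed-form target velocity
```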
arXiv Detail & Related papers (2023-02-07T18:21:24Z) - Distributed Bayesian Learning of Dynamic States [65.7870637855531]
We propose a distributed Bayesian filtering algorithm for finite-state hidden Markov models.
It can be used for sequential state estimation, as well as for modeling opinion formation over social networks under dynamic environments.
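For orientation, a minimal (non-distributed) Bayesian filter for a finite-state hidden Markov model looks as follows; the distributed, networked variant in the paper builds on this recursion, and everything below is our illustrative stand-in.
```python
# Minimal sketch: Bayesian (forward) filtering for a finite-state HMM.
import numpy as np

def hmm_filter(prior, trans, lik_cols):
    # prior: (K,) initial state distribution; trans: (K, K) with
    # trans[i, j] = P(x_t = j | x_{t-1} = i); lik_cols: list of (K,)
    # observation likelihoods p(y_t | x_t = k), one per time step.
    belief = prior.copy()
    beliefs = []
    for lik in lik_cols:
        belief = (belief @ trans) * lik   # predict, then update
        belief /= belief.sum()            # normalize to a distribution
        beliefs.append(belief)
    return np.stack(beliefs)

# Toy usage: two states, three observations (hypothetical numbers).
prior = np.array([0.5, 0.5])
trans = np.array([[0.9, 0.1], [0.2, 0.8]])
liks = [np.array([0.7, 0.2]), np.array([0.1, 0.6]), np.array([0.5, 0.5])]
print(hmm_filter(prior, trans, liks))
```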
arXiv Detail & Related papers (2022-12-05T19:40:17Z) - Gauge-equivariant flow models for sampling in lattice field theories with pseudofermions [51.52945471576731]
This work presents gauge-equivariant architectures for flow-based sampling in fermionic lattice field theories using pseudofermions as estimators for the fermionic determinant.
This is the default approach in state-of-the-art lattice field theory calculations, making this development critical to the practical application of flow models to theories such as QCD.
arXiv Detail & Related papers (2022-07-18T21:13:34Z) - Manifold Interpolating Optimal-Transport Flows for Trajectory Inference [64.94020639760026]
We present a method called Manifold Interpolating Optimal-Transport Flow (MIOFlow).
MIOFlow learns continuous population dynamics from static snapshot samples taken at sporadic timepoints.
We evaluate our method on simulated data with bifurcations and merges, as well as scRNA-seq data from embryoid body differentiation, and acute myeloid leukemia treatment.
arXiv Detail & Related papers (2022-06-29T22:19:03Z) - Conditional Sampling with Monotone GANs: from Generative Models to Likelihood-Free Inference [4.913013713982677]
We present a novel framework for conditional sampling of probability measures, using block triangular transport maps.
We develop the theoretical foundations of block triangular transport in a Banach space setting.
We then introduce a computational approach, called monotone generative adversarial networks, to learn suitable block triangular maps.
arXiv Detail & Related papers (2020-06-11T19:15:43Z) - A Near-Optimal Gradient Flow for Learning Neural Energy-Based Models [93.24030378630175]
We propose a novel numerical scheme to optimize the gradient flows for learning energy-based models (EBMs).
We derive a second-order Wasserstein gradient flow of the global relative entropy from the Fokker-Planck equation.
Compared with existing schemes, the Wasserstein gradient flow yields a smoother and near-optimal numerical scheme for approximating real data densities.
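For orientation, the standard identity underlying this line of work (Otto calculus / JKO, stated from general knowledge rather than taken from the paper) is that the Fokker-Planck equation is the Wasserstein-2 gradient flow of the relative entropy to the equilibrium density:
```latex
% Fokker-Planck as the W_2 gradient flow of KL(rho || pi), pi \propto e^{-V}.
\[
  \partial_t \rho_t
    = \nabla \cdot (\rho_t \nabla V) + \Delta \rho_t
    = \nabla \cdot \Bigl( \rho_t \, \nabla \frac{\delta}{\delta \rho}
        \, \mathrm{KL}(\rho_t \,\|\, \pi) \Bigr),
  \qquad \pi \propto e^{-V}.
\]
```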
arXiv Detail & Related papers (2019-10-31T02:26:20Z)