Conditional Variable Flow Matching: Transforming Conditional Densities with Amortized Conditional Optimal Transport
- URL: http://arxiv.org/abs/2411.08314v4
- Date: Tue, 01 Apr 2025 02:30:12 GMT
- Title: Conditional Variable Flow Matching: Transforming Conditional Densities with Amortized Conditional Optimal Transport
- Authors: Adam P. Generale, Andreas E. Robertson, Surya R. Kalidindi
- Abstract summary: We propose a framework for learning flows transforming conditional distributions with amortization across continuous conditioning variables. Key ingredients are simultaneous sample-conditioned flows over the main and conditioning variables, alongside a conditional Wasserstein distance combined with a loss-reweighting kernel facilitating conditional optimal transport. We demonstrate CVFM on a suite of increasingly challenging problems, including discrete and continuous conditional mapping benchmarks, image-to-image domain transfer, and modeling the temporal evolution of materials' internal structure during manufacturing processes.
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: Forecasting conditional stochastic nonlinear dynamical systems is a fundamental challenge repeatedly encountered across the biological and physical sciences. While flow-based models can impressively predict the temporal evolution of probability distributions representing possible outcomes of a specific process, existing frameworks cannot satisfactorily account for the impact of conditioning variables on these dynamics. Amongst several limitations, existing methods require training data with paired conditions and are developed for discrete conditioning variables. We propose Conditional Variable Flow Matching (CVFM), a framework for learning flows transforming conditional distributions with amortization across continuous conditioning variables, permitting predictions across the conditional density manifold. This is accomplished through several novel advances: in particular, simultaneous sample-conditioned flows over the main and conditioning variables, alongside a conditional Wasserstein distance combined with a loss-reweighting kernel facilitating conditional optimal transport. Collectively, these advances allow for learning system dynamics from measurement data whose states and conditioning variables are not in correspondence. We demonstrate CVFM on a suite of increasingly challenging problems, including discrete and continuous conditional mapping benchmarks, image-to-image domain transfer, and modeling the temporal evolution of materials' internal structure during manufacturing processes. We observe that CVFM results in improved performance and convergence characteristics over alternative conditional variants.
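As a rough illustration of the ingredients named in the abstract, the sketch below computes a kernel-reweighted conditional flow-matching loss with simultaneous linear-interpolant flows over both the main variable x and the conditioning variable c. The Gaussian kernel and the zero "velocity field" are hypothetical stand-ins for illustration only, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian_kernel(c0, c1, bandwidth=0.5):
    """Loss-reweighting kernel over conditioning variables (hypothetical
    Gaussian choice; the paper's kernel may differ)."""
    return np.exp(-np.sum((c0 - c1) ** 2, axis=-1) / (2.0 * bandwidth ** 2))

def cvfm_loss(velocity_fn, x0, x1, c0, c1, t):
    """Kernel-reweighted conditional flow-matching loss (illustrative).

    x0, c0: samples and conditions drawn from the source distribution
    x1, c1: samples and conditions drawn from the target distribution
    t:      interpolation times in [0, 1], shape (batch,)
    """
    t = t[:, None]
    xt = (1.0 - t) * x0 + t * x1   # linear interpolant for the main variable
    ct = (1.0 - t) * c0 + t * c1   # simultaneous flow over the conditioning variable
    u = x1 - x0                    # target velocity of the linear path
    w = gaussian_kernel(c0, c1)    # down-weight pairs with mismatched conditions
    residual = velocity_fn(xt, ct, t) - u
    return np.mean(w * np.sum(residual ** 2, axis=-1))

# Toy velocity field standing in for a neural network.
def toy_velocity(xt, ct, t):
    return np.zeros_like(xt)

batch = 8
x0, x1 = rng.normal(size=(batch, 2)), rng.normal(size=(batch, 2))
c0, c1 = rng.normal(size=(batch, 1)), rng.normal(size=(batch, 1))
t = rng.uniform(size=batch)
loss = cvfm_loss(toy_velocity, x0, x1, c0, c1, t)
```

The kernel weight is what allows training on unpaired data: samples whose conditioning variables are far apart contribute little to the objective, so states and conditions need not be in correspondence.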
Related papers
- Probabilistic Forecasting via Autoregressive Flow Matching [1.5467259918426441]
FlowTime is a generative model for probabilistic forecasting of time-series data.
We decompose the joint distribution of future observations into a sequence of conditional densities, each modeled via a shared flow.
We demonstrate the effectiveness of FlowTime on multiple dynamical systems and real-world forecasting tasks.
arXiv Detail & Related papers (2025-03-13T13:54:24Z)
- On conditional diffusion models for PDE simulations [53.01911265639582]
We study score-based diffusion models for forecasting and assimilation of sparse observations.
We propose an autoregressive sampling approach that significantly improves performance in forecasting.
We also propose a new training strategy for conditional score-based models that achieves stable performance over a range of history lengths.
arXiv Detail & Related papers (2024-10-21T18:31:04Z)
- Sequential Representation Learning via Static-Dynamic Conditional Disentanglement [58.19137637859017]
This paper explores self-supervised disentangled representation learning within sequential data, focusing on separating time-independent and time-varying factors in videos.
We propose a new model that breaks the usual independence assumption between those factors by explicitly accounting for the causal relationship between the static/dynamic variables.
Experiments show that the proposed approach outperforms previous complex state-of-the-art techniques in scenarios where the dynamics of a scene are influenced by its content.
arXiv Detail & Related papers (2024-08-10T17:04:39Z)
- FUSE: Fast Unified Simulation and Estimation for PDEs [11.991297011923004]
We argue that solving both problems within the same framework can lead to consistent gains in accuracy and robustness.
We present the capabilities of the proposed methodology for predicting continuous and discrete biomarkers in full-body haemodynamics simulations.
arXiv Detail & Related papers (2024-05-23T13:37:26Z)
- Conditional Pseudo-Reversible Normalizing Flow for Surrogate Modeling in Quantifying Uncertainty Propagation [11.874729463016227]
We introduce a conditional pseudo-reversible normalizing flow for constructing surrogate models of a physical model polluted by additive noise.
The training process utilizes a dataset consisting of input-output pairs, without requiring prior knowledge of the noise or the function.
Our model, once trained, can generate samples from any conditional probability density functions whose high probability regions are covered by the training set.
arXiv Detail & Related papers (2024-03-31T00:09:58Z)
- Extended Flow Matching: a Method of Conditional Generation with Generalized Continuity Equation [19.71452214879951]
Conditional generation is one of the most important applications of generative models.
We show that we can introduce inductive bias to the conditional generation through the matrix field.
We present our theory along with experimental results that support the competitiveness of EFM in conditional generation.
arXiv Detail & Related papers (2024-02-29T04:12:32Z)
- Synthetic location trajectory generation using categorical diffusion models [50.809683239937584]
Diffusion probabilistic models (DPMs) have rapidly evolved to become one of the predominant generative models for the simulation of synthetic data.
We propose using DPMs for the generation of synthetic individual location trajectories (ILTs) which are sequences of variables representing physical locations visited by individuals.
arXiv Detail & Related papers (2024-02-19T15:57:39Z)
- Benchmarking Autoregressive Conditional Diffusion Models for Turbulent Flow Simulation [29.806100463356906]
We analyze if fully data-driven fluid solvers that utilize an autoregressive rollout based on conditional diffusion models are a viable option.
We investigate accuracy, posterior sampling, spectral behavior, and temporal stability, while requiring that methods generalize to flow parameters beyond the training regime.
We find that even simple diffusion-based approaches can outperform multiple established flow prediction methods in terms of accuracy and temporal stability, while being on par with state-of-the-art stabilization techniques like unrolling at training time.
arXiv Detail & Related papers (2023-09-04T18:01:42Z)
- ShiftDDPMs: Exploring Conditional Diffusion Models by Shifting Diffusion Trajectories [144.03939123870416]
We propose a novel conditional diffusion model by introducing conditions into the forward process.
We use extra latent space to allocate an exclusive diffusion trajectory for each condition based on some shifting rules.
We formulate our method, which we call ShiftDDPMs, and provide a unified point of view on existing related methods.
arXiv Detail & Related papers (2023-02-05T12:48:21Z)
- Conditional Permutation Invariant Flows [23.740061786510417]
We present a conditional generative probabilistic model of set-valued data with a tractable log density.
These dynamics are driven by a learnable per-set-element term and pairwise interactions, both parametrized by deep neural networks.
We illustrate the utility of this model via applications including (1) complex traffic scene generation conditioned on visually specified map information, and (2) object bounding box generation conditioned directly on images.
arXiv Detail & Related papers (2022-06-17T21:43:38Z)
- Likelihood-Free Inference in State-Space Models with Unknown Dynamics [71.94716503075645]
We introduce a method for inferring and predicting latent states in state-space models where observations can only be simulated, and transition dynamics are unknown.
We propose a way of doing likelihood-free inference (LFI) of states and state prediction with a limited number of simulations.
arXiv Detail & Related papers (2021-11-02T12:33:42Z)
- Autoregressive Score Matching [113.4502004812927]
We propose autoregressive conditional score models (AR-CSM), where we parameterize the joint distribution in terms of the derivatives of univariate log-conditionals (scores).
For AR-CSM models, this divergence between data and model distributions can be computed and optimized efficiently, requiring no expensive sampling or adversarial training.
We show with extensive experimental results that it can be applied to density estimation on synthetic data, image generation, image denoising, and training latent variable models with implicit encoders.
arXiv Detail & Related papers (2020-10-24T07:01:24Z)
- Modeling Continuous Stochastic Processes with Dynamic Normalizing Flows [40.9137348900942]
We propose a novel type of flow driven by a differential deformation of the Wiener process.
As a result, we obtain a rich time series model whose observable process inherits many of the appealing properties of its base process.
arXiv Detail & Related papers (2020-02-24T20:13:43Z)
- Learning Likelihoods with Conditional Normalizing Flows [54.60456010771409]
Conditional normalizing flows (CNFs) are efficient in sampling and inference.
We present a study of CNFs in which the mapping from base density to output space is conditioned on an input x, to model conditional densities p(y|x).
arXiv Detail & Related papers (2019-11-29T19:17:58Z)
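To illustrate the conditional-density idea underlying this last entry, a single conditional affine-flow layer evaluates log p(y|x) via the change-of-variables formula. This is a minimal sketch, not the paper's architecture; the identity scale and shift functions below are hypothetical stand-ins for conditioning networks:

```python
import numpy as np

def conditional_affine_logpdf(y, x, scale_fn, shift_fn):
    """log p(y|x) for y = scale(x) * z + shift(x), z ~ N(0, I).

    Change of variables: log p(y|x) = log N(z; 0, I) - sum(log |scale(x)|).
    """
    s, b = scale_fn(x), shift_fn(x)
    z = (y - b) / s
    log_base = -0.5 * np.sum(z ** 2 + np.log(2.0 * np.pi), axis=-1)
    log_det = -np.sum(np.log(np.abs(s)), axis=-1)  # log |det Jacobian| of inverse map
    return log_base + log_det

# Identity conditioning: scale 1, shift 0, so p(y|x) is a standard normal.
y = np.zeros((1, 1))
x = np.ones((1, 3))
logp = conditional_affine_logpdf(
    y, x,
    scale_fn=lambda x: np.ones((1, 1)),
    shift_fn=lambda x: np.zeros((1, 1)),
)
```

Stacking such layers, with scale_fn and shift_fn replaced by networks taking x as input, yields the conditional normalizing flows studied in the cited work.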
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.