On the Inverse Flow Matching Problem in the One-Dimensional and Gaussian Cases
- URL: http://arxiv.org/abs/2512.23265v1
- Date: Mon, 29 Dec 2025 07:45:36 GMT
- Title: On the Inverse Flow Matching Problem in the One-Dimensional and Gaussian Cases
- Authors: Alexander Korotin, Gudmund Pammer
- Abstract summary: This paper studies the inverse problem of flow matching (FM) between distributions with finite exponential moment. It is motivated by modern generative AI applications such as the distillation of flow matching models.
- Score: 58.228512978259026
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper studies the inverse problem of flow matching (FM) between distributions with finite exponential moment, a problem motivated by modern generative AI applications such as the distillation of flow matching models. Uniqueness of the solution is established in two cases - the one-dimensional setting and the Gaussian case. The general multidimensional problem remains open for future studies.
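To illustrate the forward problem that the paper inverts, here is a minimal sketch (not taken from the paper) of flow matching between two one-dimensional Gaussians. Under the standard linear interpolation path x_t = (1 - t) x0 + t x1 with independent Gaussian endpoints, the marginal velocity field u_t(x) = E[x1 - x0 | x_t = x] has a closed form because (x1 - x0, x_t) are jointly Gaussian; all parameter values below are illustrative assumptions.

```python
import numpy as np

def gaussian_fm_velocity(x, t, m0, s0, m1, s1):
    """Closed-form FM velocity field between N(m0, s0^2) and N(m1, s1^2)
    under the linear path x_t = (1 - t) x0 + t x1 with independent endpoints."""
    mean_t = (1 - t) * m0 + t * m1                      # E[x_t]
    var_t = (1 - t) ** 2 * s0 ** 2 + t ** 2 * s1 ** 2   # Var[x_t]
    cov = t * s1 ** 2 - (1 - t) * s0 ** 2               # Cov(x1 - x0, x_t)
    return (m1 - m0) + cov / var_t * (x - mean_t)

# Integrate dx/dt = u_t(x) with Euler steps starting from source samples;
# the terminal samples should match the target Gaussian.
rng = np.random.default_rng(0)
m0, s0, m1, s1 = 0.0, 1.0, 3.0, 0.5   # illustrative source/target parameters
x = m0 + s0 * rng.standard_normal(100_000)
n_steps = 1000
dt = 1.0 / n_steps
for k in range(n_steps):
    x = x + dt * gaussian_fm_velocity(x, k * dt, m0, s0, m1, s1)

print(round(x.mean(), 2), round(x.std(), 2))  # ≈ 3.0 0.5, the target moments
```

The inverse problem asks the converse question: given the flow (or the trained velocity field), recover the coupling or endpoint distributions that generated it, which the paper shows has a unique answer in the one-dimensional and Gaussian cases.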
Related papers
- Error Analysis of Bayesian Inverse Problems with Generative Priors [9.276062058338443]
We analyze such problems by deriving quantitative error bounds for minimum Wasserstein-2 generative models used as the prior. We show that, under some assumptions, the error in the posterior due to the generative prior inherits the same rate as the prior with respect to the Wasserstein-1 distance.
arXiv Detail & Related papers (2026-01-24T08:45:27Z)
- Diffusion models for inverse problems [57.87606622211111]
We review the various approaches proposed over the years and cover extensions to more challenging settings, including blind cases, high-dimensional data, and problems under data scarcity and distribution mismatch.
arXiv Detail & Related papers (2025-08-04T01:26:06Z)
- Exact Evaluation of the Accuracy of Diffusion Models for Inverse Problems with Gaussian Data Distributions [0.0]
We investigate the accuracy of diffusion models when applied to a Gaussian data distribution for deblurring. Within this constrained context, we are able to precisely analyze the discrepancy between the theoretical resolution of inverse problems and their resolution obtained using diffusion models. Our findings allow for the comparison of different algorithms from the literature.
arXiv Detail & Related papers (2025-07-09T16:36:51Z)
- FlowDPS: Flow-Driven Posterior Sampling for Inverse Problems [51.99765487172328]
Posterior sampling for inverse problem solving can be effectively achieved using flows. Flow-Driven Posterior Sampling (FlowDPS) outperforms state-of-the-art alternatives.
arXiv Detail & Related papers (2025-03-11T07:56:14Z)
- Solving Linear Inverse Problems Provably via Posterior Sampling with Latent Diffusion Models [98.95988351420334]
We present the first framework to solve linear inverse problems leveraging pre-trained latent diffusion models.
We theoretically analyze our algorithm showing provable sample recovery in a linear model setting.
arXiv Detail & Related papers (2023-07-02T17:21:30Z)
- Variational Laplace Autoencoders [53.08170674326728]
Variational autoencoders employ an amortized inference model to approximate the posterior of latent variables.
We present a novel approach that addresses the limited posterior expressiveness of the fully factorized Gaussian assumption.
We also present a general framework named Variational Laplace Autoencoders (VLAEs) for training deep generative models.
arXiv Detail & Related papers (2022-11-30T18:59:27Z) - Diffusion Posterior Sampling for General Noisy Inverse Problems [50.873313752797124]
We extend diffusion solvers to handle noisy (non)linear inverse problems via approximation of the posterior sampling.
Our method demonstrates that diffusion models can incorporate various measurement noise statistics.
arXiv Detail & Related papers (2022-09-29T11:12:27Z) - Differentiable Gaussianization Layers for Inverse Problems Regularized by Deep Generative Models [5.439020425819001]
We show that latent tensors of deep generative models can fall out of the desired high-dimensional standard Gaussian distribution during inversion.
Our approach achieves state-of-the-art performance in terms of accuracy and consistency.
arXiv Detail & Related papers (2021-12-07T17:53:09Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.