Plug-in estimation of Schrödinger bridges
- URL: http://arxiv.org/abs/2408.11686v1
- Date: Wed, 21 Aug 2024 15:07:25 GMT
- Title: Plug-in estimation of Schrödinger bridges
- Authors: Aram-Alexandre Pooladian, Jonathan Niles-Weed
- Abstract summary: We propose a procedure for estimating the Schrödinger bridge between two probability distributions.
We show that our proposal, which we call the Sinkhorn bridge, provably estimates the Schrödinger bridge with a rate of convergence that depends on the intrinsic dimensionality of the target measure.
- Score: 15.685006881635209
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We propose a procedure for estimating the Schrödinger bridge between two probability distributions. Unlike existing approaches, our method does not require iteratively simulating forward and backward diffusions or training neural networks to fit unknown drifts. Instead, we show that the potentials obtained from solving the static entropic optimal transport problem between the source and target samples can be modified to yield a natural plug-in estimator of the time-dependent drift that defines the bridge between two measures. Under minimal assumptions, we show that our proposal, which we call the Sinkhorn bridge, provably estimates the Schrödinger bridge with a rate of convergence that depends on the intrinsic dimensionality of the target measure. Our approach combines results from the areas of sampling and of theoretical and statistical entropic optimal transport.
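To make the plug-in recipe concrete, here is a minimal numerical sketch of the pipeline the abstract describes, under illustrative assumptions that are ours, not the paper's: squared-Euclidean cost, a Brownian reference process with volatility eps on [0, 1], uniform sample weights, and Euler-Maruyama simulation. The drift below is the softmax-weighted barycenter that an entropic potential induces; the paper's exact scalings may differ.

```python
import numpy as np
from scipy.special import logsumexp

def sinkhorn_potential(X, Y, eps, iters=300):
    """Log-domain Sinkhorn for entropic OT between uniform empirical measures.
    Cost c(x, y) = ||x - y||^2 / 2; returns the dual potential g on the
    target samples Y, the only quantity the plug-in drift needs."""
    n, m = len(X), len(Y)
    C = 0.5 * ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    f, g = np.zeros(n), np.zeros(m)
    for _ in range(iters):
        f = -eps * (logsumexp((g[None, :] - C) / eps, axis=1) - np.log(m))
        g = -eps * (logsumexp((f[:, None] - C) / eps, axis=0) - np.log(n))
    return g

def plugin_drift(x, t, Y, g, eps):
    """Estimated bridge drift at state x and time t in [0, 1): a softmax-
    weighted barycenter of the target samples, relative to x."""
    logits = g / eps - ((x - Y) ** 2).sum(-1) / (2 * eps * (1 - t))
    p = np.exp(logits - logsumexp(logits))
    return (p @ Y - x) / (1 - t)

# Toy run: bridge N(0, I) to a shifted Gaussian in 2-D.
rng = np.random.default_rng(0)
eps, n, steps = 0.5, 400, 100
X = rng.normal(size=(n, 2))                          # source samples
Y = rng.normal(size=(n, 2)) + np.array([4.0, 0.0])   # target samples
g = sinkhorn_potential(X, Y, eps)

x, h = X[0].copy(), 1.0 / steps
for k in range(steps - 1):        # stop short of t = 1, where 1/(1-t) blows up
    t = k * h
    x = x + h * plugin_drift(x, t, Y, g, eps) + np.sqrt(eps * h) * rng.normal(size=2)
print("terminal point near target cluster:", x)
```

Note that nothing here is iterative in the forward-backward sense: a single static Sinkhorn solve fixes the drift for all times t, which is the point of the plug-in construction.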
Related papers
- Latent Schrödinger Bridge: Prompting Latent Diffusion for Fast Unpaired Image-to-Image Translation [58.19676004192321]
Diffusion models (DMs), which enable both image generation from noise and inversion from data, have inspired powerful unpaired image-to-image (I2I) translation algorithms.
We tackle this problem with Schrödinger Bridges (SBs), which are stochastic differential equations (SDEs) between distributions with minimal transport cost.
Inspired by this observation, we propose Latent Schrödinger Bridges (LSBs) that approximate the SB ODE via pre-trained Stable Diffusion.
We demonstrate that our algorithm successfully conducts competitive I2I translation in the unsupervised setting with only a fraction of the cost required by previous DM-based methods.
arXiv Detail & Related papers (2024-11-22T11:24:14Z)
- BM$^2$: Coupled Schrödinger Bridge Matching [4.831663144935879]
We introduce a simple, non-iterative approach for learning Schrödinger bridges with neural networks.
A preliminary theoretical analysis of the convergence properties of BM$^2$ is carried out, supported by numerical experiments.
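For context, the sketch below shows the generic bridge-matching regression that this family of methods builds on; it is not BM$^2$'s specific coupled objective, and the endpoint pairing and eps are illustrative assumptions.

```python
import numpy as np

def bridge_matching_batch(x0, x1, eps, rng):
    """Regression data for (Brownian) bridge matching: given paired endpoints
    (x0, x1), sample t and a Brownian-bridge point x_t; a drift network
    v(t, x_t) is then regressed onto the conditional drift (x1 - x_t)/(1 - t)."""
    n, d = x0.shape
    t = rng.uniform(0.05, 0.95, size=(n, 1))   # keep away from the endpoints
    mean = (1.0 - t) * x0 + t * x1
    std = np.sqrt(eps * t * (1.0 - t))         # Brownian-bridge variance
    xt = mean + std * rng.normal(size=(n, d))
    return t, xt, (x1 - xt) / (1.0 - t)

# Usage: minimize ||v_theta(t, x_t) - target||^2 over such batches. Plain
# bridge matching pairs x0 and x1 independently; BM^2 instead couples the
# forward and backward problems.
rng = np.random.default_rng(1)
x0 = rng.normal(size=(8, 2))
x1 = x0 + 3.0                                  # hypothetical paired endpoints
t, xt, target = bridge_matching_batch(x0, x1, eps=0.5, rng=rng)
```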
arXiv Detail & Related papers (2024-09-14T08:57:46Z)
- Amortized Posterior Sampling with Diffusion Prior Distillation [55.03585818289934]
We propose a variational inference approach to sample from the posterior distribution for solving inverse problems.
We show that our method is applicable to standard signals in Euclidean space, as well as signals on manifolds.
arXiv Detail & Related papers (2024-07-25T09:53:12Z)
- Dynamical Measure Transport and Neural PDE Solvers for Sampling [77.38204731939273]
We cast the task of sampling from a probability density as transporting a tractable density function to the target.
We employ physics-informed neural networks (PINNs) to approximate the solutions of the respective partial differential equations (PDEs).
PINNs allow for simulation- and discretization-free optimization and can be trained very efficiently.
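As a hedged illustration of the kind of PDE such methods enforce (the paper's exact equations and loss weighting may differ), the velocity field $v$ transporting the density path $\rho_t$ must satisfy the continuity equation, and a PINN minimizes its squared residual at sampled collocation points:

```latex
% Continuity equation for a density path \rho_t transported by v_t:
\partial_t \rho_t(x) + \nabla \cdot \big( \rho_t(x)\, v_t(x) \big) = 0,
\qquad \rho_0 = \text{tractable density}, \quad \rho_1 = \text{target}.
% PINN-style objective at collocation points (t_i, x_i), with boundary terms
% enforcing the two endpoint densities:
\mathcal{L}(\theta) = \frac{1}{N} \sum_{i=1}^{N}
\Big( \partial_t \rho_\theta(t_i, x_i)
      + \nabla \cdot \big( \rho_\theta v_\theta \big)(t_i, x_i) \Big)^2
      + \text{boundary terms}.
```

Because the residual is evaluated pointwise, no trajectories are simulated and no time grid is fixed, which is what "simulation- and discretization-free" refers to.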
arXiv Detail & Related papers (2024-07-10T17:39:50Z)
- Diffusion Bridge Mixture Transports, Schrödinger Bridge Problems and Generative Modeling [4.831663144935879]
We propose a novel sampling-based iterative algorithm, the iterated diffusion bridge mixture (IDBM) procedure, aimed at solving the dynamic Schrödinger bridge problem.
The IDBM procedure exhibits the attractive property of realizing a valid transport between the target probability measures at each iteration.
arXiv Detail & Related papers (2023-04-03T12:13:42Z)
- Robust probabilistic inference via a constrained transport metric [8.85031165304586]
We offer a novel alternative by constructing an exponentially tilted empirical likelihood carefully designed to concentrate near a parametric family of distributions.
The proposed approach finds applications in a wide variety of robust inference problems, where we intend to perform inference on the parameters associated with the centering distribution.
We demonstrate superior performance of our methodology when compared against state-of-the-art robust Bayesian inference methods.
arXiv Detail & Related papers (2023-03-17T16:10:06Z)
- Learning Optimal Transport Between two Empirical Distributions with Normalizing Flows [12.91637880428221]
We propose to leverage the flexibility of neural networks to learn an approximate optimal transport map.
We show that a particular instance of invertible neural networks, namely normalizing flows, can be used to approximate the solution of this OT problem.
arXiv Detail & Related papers (2022-07-04T08:08:47Z)
- Near-optimal estimation of smooth transport maps with kernel sums-of-squares [81.02564078640275]
Under smoothness conditions, the squared Wasserstein distance between two distributions can be efficiently computed with appealing statistical error upper bounds.
The object of interest for applications such as generative modeling is the underlying optimal transport map.
We propose the first tractable algorithm for which the statistical $L^2$ error on the maps nearly matches the existing minimax lower bounds for smooth map estimation.
arXiv Detail & Related papers (2021-12-03T13:45:36Z)
- Variational Refinement for Importance Sampling Using the Forward Kullback-Leibler Divergence [77.06203118175335]
Variational Inference (VI) is a popular alternative to exact sampling in Bayesian inference.
Importance sampling (IS) is often used to fine-tune and de-bias the estimates of approximate Bayesian inference procedures.
We propose a novel combination of optimization and sampling techniques for approximate Bayesian inference.
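One standard construction behind such combinations (the paper's exact estimator may differ) is to target the forward, inclusive KL divergence and estimate it by self-normalized importance sampling with draws from the variational approximation $q_\phi$:

```latex
% Forward (inclusive) KL from the target p to the approximation q_\phi:
\mathrm{KL}(p \,\|\, q_\phi) = \mathbb{E}_{x \sim p}\!\left[ \log \frac{p(x)}{q_\phi(x)} \right].
% With draws x_i \sim q_\phi, unnormalized target \tilde p, and self-normalized
% weights \bar w_i = w_i / \sum_j w_j where w_i = \tilde p(x_i) / q_\phi(x_i),
% a surrogate objective (weights held fixed when differentiating) is
\widehat{\mathcal{L}}(\phi) = - \sum_{i=1}^{N} \bar w_i \, \log q_\phi(x_i).
```

Minimizing this surrogate spreads mass where the target has mass, which is why forward-KL refinement pairs naturally with importance sampling.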
arXiv Detail & Related papers (2021-06-30T11:00:24Z)
- Deep Generative Learning via Schrödinger Bridge [14.138796631423954]
We learn a generative model via entropy interpolation with a Schrödinger Bridge.
We show that the generative model via the Schrödinger Bridge is comparable with state-of-the-art GANs.
arXiv Detail & Related papers (2021-06-19T03:35:42Z)
- Multivariate Probabilistic Regression with Natural Gradient Boosting [63.58097881421937]
We propose a Natural Gradient Boosting (NGBoost) approach based on nonparametrically modeling the conditional parameters of the multivariate predictive distribution.
Our method is robust, works out-of-the-box without extensive tuning, is modular with respect to the assumed target distribution, and performs competitively in comparison to existing approaches.
arXiv Detail & Related papers (2021-06-07T17:44:49Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.