Plug-in estimation of Schrödinger bridges
- URL: http://arxiv.org/abs/2408.11686v1
- Date: Wed, 21 Aug 2024 15:07:25 GMT
- Title: Plug-in estimation of Schrödinger bridges
- Authors: Aram-Alexandre Pooladian, Jonathan Niles-Weed
- Abstract summary: We propose a procedure for estimating the Schrödinger bridge between two probability distributions.
We show that our proposal, which we call the *Sinkhorn bridge*, provably estimates the Schrödinger bridge with a rate of convergence that depends on the intrinsic dimensionality of the target measure.
- Score: 15.685006881635209
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We propose a procedure for estimating the Schrödinger bridge between two probability distributions. Unlike existing approaches, our method does not require iteratively simulating forward and backward diffusions or training neural networks to fit unknown drifts. Instead, we show that the potentials obtained from solving the static entropic optimal transport problem between the source and target samples can be modified to yield a natural plug-in estimator of the time-dependent drift that defines the bridge between two measures. Under minimal assumptions, we show that our proposal, which we call the *Sinkhorn bridge*, provably estimates the Schrödinger bridge with a rate of convergence that depends on the intrinsic dimensionality of the target measure. Our approach combines results from the areas of sampling, and theoretical and statistical entropic optimal transport.
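For concreteness, the construction the abstract describes can be sketched in a few lines: run log-domain Sinkhorn iterations to solve the static entropic OT problem between the samples, then turn the target-side potential into a time-dependent drift. The sketch below is a minimal illustration under assumed conventions (Brownian reference with diffusivity `eps` on the time interval [0, 1]); the function names and the exact drift formula are our assumptions, not the paper's stated construction.

```python
import numpy as np
from scipy.special import logsumexp

def sinkhorn_potentials(X, Y, eps=0.1, n_iter=500):
    """Log-domain Sinkhorn for static entropic OT between the uniform
    empirical measures on X (source, n x d) and Y (target, m x d)."""
    n, m = len(X), len(Y)
    # Quadratic cost C[i, j] = ||x_i - y_j||^2 / 2
    C = 0.5 * np.sum((X[:, None, :] - Y[None, :, :]) ** 2, axis=-1)
    f, g = np.zeros(n), np.zeros(m)
    for _ in range(n_iter):
        f = -eps * logsumexp((g[None, :] - C) / eps, axis=1, b=1.0 / m)
        g = -eps * logsumexp((f[:, None] - C) / eps, axis=0, b=1.0 / n)
    return f, g

def plugin_drift(z, t, Y, g, eps=0.1):
    """Assumed plug-in drift at state z (shape (d,)) and time t in [0, 1):
    eps * gradient of the log of a Gaussian mixture over the target
    samples, reweighted by the entropic potential g."""
    var = eps * (1.0 - t)
    logw = g / eps - np.sum((z[None, :] - Y) ** 2, axis=-1) / (2.0 * var)
    w = np.exp(logw - logw.max())
    w /= w.sum()                     # softmax weights over target samples
    return (w @ Y - z) / (1.0 - t)   # barycentric form of eps * grad-log
```

Under the same assumptions, simulating the estimated bridge reduces to Euler-Maruyama steps `Z += plugin_drift(Z, t, Y, g, eps) * dt + np.sqrt(eps * dt) * noise`, started from fresh source samples.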
Related papers
- BM$^2$: Coupled Schrödinger Bridge Matching [4.831663144935879]
We introduce a simple *non-iterative* approach for learning Schrödinger bridges with neural networks.
A preliminary theoretical analysis of the convergence properties of BM$^2$ is carried out, supported by numerical experiments.
arXiv Detail & Related papers (2024-09-14T08:57:46Z)
- Localized Schrödinger Bridge Sampler [0.276240219662896]
We consider the generative problem of sampling from an unknown distribution for which only a sufficiently large number of training samples are available.
A key bottleneck of this approach is the exponential dependence of the required training samples on the dimension, $d$, of the ambient state space.
We propose a localization strategy which exploits conditional independence of conditional expectation values.
arXiv Detail & Related papers (2024-09-12T12:02:51Z)
- Amortized Posterior Sampling with Diffusion Prior Distillation [55.03585818289934]
We propose a variational inference approach to sample from the posterior distribution for solving inverse problems.
We show that our method is applicable to standard signals in Euclidean space, as well as signals on a manifold.
arXiv Detail & Related papers (2024-07-25T09:53:12Z)
- Dynamical Measure Transport and Neural PDE Solvers for Sampling [77.38204731939273]
We cast the task of sampling from a probability density as transporting a tractable density function to the target.
We employ physics-informed neural networks (PINNs) to approximate the respective partial differential equations (PDEs) solutions.
PINNs allow for simulation- and discretization-free optimization and can be trained very efficiently.
arXiv Detail & Related papers (2024-07-10T17:39:50Z)
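To make "physics-informed" concrete, here is a minimal, generic PINN sketch: it fits a toy 1-D heat equation u_t = u_xx purely from the squared PDE residual at random collocation points (initial and boundary terms omitted for brevity). This is an illustration of the technique, not the transport PDEs the paper actually solves, and the architecture is our assumption.

```python
import torch

# Small network u(t, x) -> scalar; architecture is illustrative.
net = torch.nn.Sequential(torch.nn.Linear(2, 64), torch.nn.Tanh(),
                          torch.nn.Linear(64, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

for step in range(1000):
    tx = torch.rand(256, 2, requires_grad=True)   # collocation points (t, x)
    u = net(tx)
    du = torch.autograd.grad(u.sum(), tx, create_graph=True)[0]
    u_t, u_x = du[:, 0], du[:, 1]
    u_xx = torch.autograd.grad(u_x.sum(), tx, create_graph=True)[0][:, 1]
    loss = ((u_t - u_xx) ** 2).mean()             # PDE residual loss only
    opt.zero_grad(); loss.backward(); opt.step()
```

The "simulation-free" property claimed above corresponds to the fact that training touches only sampled space-time points, never trajectories of the underlying process.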
- Diffusion Bridge Mixture Transports, Schrödinger Bridge Problems and Generative Modeling [4.831663144935879]
We propose a novel sampling-based iterative algorithm, the iterated diffusion bridge mixture (IDBM) procedure, aimed at solving the dynamic Schrödinger bridge problem.
The IDBM procedure exhibits the attractive property of realizing a valid transport between the target probability measures at each iteration.
arXiv Detail & Related papers (2023-04-03T12:13:42Z)
- Robust probabilistic inference via a constrained transport metric [8.85031165304586]
We offer a novel alternative by constructing an exponentially tilted empirical likelihood carefully designed to concentrate near a parametric family of distributions.
The proposed approach finds applications in a wide variety of robust inference problems, where we intend to perform inference on the parameters associated with the centering distribution.
We demonstrate superior performance of our methodology when compared against state-of-the-art robust Bayesian inference methods.
arXiv Detail & Related papers (2023-03-17T16:10:06Z)
- Near-optimal estimation of smooth transport maps with kernel sums-of-squares [81.02564078640275]
Under smoothness conditions, the squared Wasserstein distance between two distributions can be efficiently computed with appealing statistical error upper bounds.
The object of interest for applications such as generative modeling is the underlying optimal transport map.
We propose the first tractable algorithm for which the statistical $L^2$ error on the maps nearly matches the existing minimax lower bounds for smooth map estimation.
arXiv Detail & Related papers (2021-12-03T13:45:36Z)
- Variational Refinement for Importance Sampling Using the Forward Kullback-Leibler Divergence [77.06203118175335]
Variational Inference (VI) is a popular alternative to exact sampling in Bayesian inference.
Importance sampling (IS) is often used to fine-tune and de-bias the estimates of approximate Bayesian inference procedures.
We propose a novel combination of optimization and sampling techniques for approximate Bayesian inference.
arXiv Detail & Related papers (2021-06-30T11:00:24Z)
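As a reminder of the de-biasing step the entry above refers to, here is a textbook self-normalized importance-sampling sketch (a generic construction, not this paper's refinement scheme): a Gaussian proposal stands in for the variational approximation q, and the target p is known only up to a normalizing constant.

```python
import numpy as np

rng = np.random.default_rng(0)
log_p = lambda x: -0.5 * (x - 1.0) ** 2          # unnormalized target: N(1, 1)
mu, sigma = 0.0, 2.0                             # proposal q = N(0, 4)
x = rng.normal(mu, sigma, size=10_000)
log_q = -0.5 * ((x - mu) / sigma) ** 2 - np.log(sigma)
logw = log_p(x) - log_q                          # log importance weights
w = np.exp(logw - logw.max())
w /= w.sum()                                     # self-normalization
print("IS estimate of E_p[X]:", np.sum(w * x))   # approx. 1.0
```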
- Deep Generative Learning via Schrödinger Bridge [14.138796631423954]
We learn a generative model via entropy interpolation with a Schrödinger Bridge.
We show that the generative model via Schrödinger Bridge is comparable with state-of-the-art GANs.
arXiv Detail & Related papers (2021-06-19T03:35:42Z)
- Multivariate Probabilistic Regression with Natural Gradient Boosting [63.58097881421937]
We propose a Natural Gradient Boosting (NGBoost) approach based on nonparametrically modeling the conditional parameters of the multivariate predictive distribution.
Our method is robust, works out-of-the-box without extensive tuning, is modular with respect to the assumed target distribution, and performs competitively in comparison to existing approaches.
arXiv Detail & Related papers (2021-06-07T17:44:49Z)
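The library interface the entry above refers to looks roughly as follows. This is a hedged sketch using the documented univariate Normal output for simplicity (the paper's contribution is the multivariate extension), with synthetic data; attribute names may differ across ngboost versions.

```python
import numpy as np
from ngboost import NGBRegressor
from ngboost.distns import Normal

# Synthetic regression data (illustrative only).
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.3 * rng.normal(size=500)

# Fit NGBoost with a Normal predictive distribution.
ngb = NGBRegressor(Dist=Normal, n_estimators=200).fit(X, y)
dist = ngb.pred_dist(X[:5])    # predictive distribution objects
print(dist.loc, dist.scale)    # per-example mean and standard deviation
```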
- On Projection Robust Optimal Transport: Sample Complexity and Model Misspecification [101.0377583883137]
Projection robust (PR) OT seeks to maximize the OT cost between two measures by choosing a $k$-dimensional subspace onto which they can be projected.
Our first contribution is to establish several fundamental statistical properties of PR Wasserstein distances.
Next, we propose the integral PR Wasserstein (IPRW) distance as an alternative to the PRW distance, by averaging rather than optimizing on subspaces.
arXiv Detail & Related papers (2020-06-22T14:35:33Z)
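For reference, the two distances contrasted in the last entry can be written side by side; a sketch in standard notation, assuming projections $E$ range over the Stiefel manifold $\mathrm{St}(d,k)$ with uniform measure $\sigma$:

```latex
% Projection robust Wasserstein: optimize over k-dimensional projections.
\mathrm{PRW}_k(\mu,\nu) = \sup_{E \in \mathrm{St}(d,k)}
    W_p\bigl(E^\top_{\#}\mu,\ E^\top_{\#}\nu\bigr)

% Integral PRW: average over projections instead of optimizing.
\mathrm{IPRW}_k(\mu,\nu) = \Bigl(\int_{\mathrm{St}(d,k)}
    W_p^p\bigl(E^\top_{\#}\mu,\ E^\top_{\#}\nu\bigr)\,\mathrm{d}\sigma(E)\Bigr)^{1/p}
```

Since a supremum dominates an average, $\mathrm{IPRW}_k \le \mathrm{PRW}_k$; averaging removes the inner optimization over subspaces, which is the trade-off the entry describes.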