Entropic optimal transport beyond product reference couplings: the Gaussian case on Euclidean space
- URL: http://arxiv.org/abs/2507.01709v1
- Date: Wed, 02 Jul 2025 13:40:21 GMT
- Title: Entropic optimal transport beyond product reference couplings: the Gaussian case on Euclidean space
- Authors: Paul Freulon, Nikitas Georgakis, Victor Panaretos
- Abstract summary: We argue that flexibility in the choice of reference measure can be important in statistical contexts. We show in numerical examples that choosing a suitable reference plan reduces the bias caused by the entropic penalty.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The optimal transport problem with squared Euclidean cost consists in finding a coupling between two input measures that maximizes correlation. Consequently, the optimal coupling is often singular with respect to Lebesgue measure. Regularizing the optimal transport problem with an entropy term yields an approximation called entropic optimal transport. Entropic penalties steer the induced coupling toward a reference measure with desired properties. For instance, when seeking a diffuse coupling, the most popular reference measures are the Lebesgue measure and the product of the two input measures. In this work, we study the case where the reference coupling is not necessarily assumed to be a product. We focus on the Gaussian case as a motivating paradigm, and provide a reduction of this more general optimal transport criterion to a matrix optimization problem. This reduction enables us to provide a complete description of the solution, both in terms of the primal variable and the dual variables. We argue that flexibility in terms of the reference measure can be important in statistical contexts, for instance when one has prior information, when there is uncertainty regarding the measures to be coupled, or to reduce bias when the entropic problem is used to estimate the un-regularized transport problem. In particular, we show in numerical examples that choosing a suitable reference plan allows one to reduce the bias caused by the entropic penalty.
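The abstract's central object, entropic optimal transport with a non-product reference coupling, can be illustrated in the discrete setting: for a reference plan $R$ with full support, the minimizer of $\langle C, \pi \rangle + \varepsilon\,\mathrm{KL}(\pi \,\|\, R)$ under marginal constraints still has a Sinkhorn-type scaling form, with the Gibbs kernel reweighted entrywise by $R$. The sketch below is illustrative only (plain NumPy, hypothetical function names); it is a toy discrete analogue, not the paper's Gaussian closed form.

```python
import numpy as np

def sinkhorn_with_reference(a, b, C, R, eps=0.1, n_iter=500):
    """Entropic OT with a general reference coupling R:
        min_pi <C, pi> + eps * KL(pi || R)
    subject to pi having marginals a and b. The optimizer factors as
    pi = diag(u) K diag(v) with K = R * exp(-C / eps), so standard
    Sinkhorn iterations apply with this reference-weighted kernel."""
    K = R * np.exp(-C / eps)
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)
        u = a / (K @ v)
    return u[:, None] * K * v[None, :]

# Two small discrete measures on the line.
x = np.linspace(-1.0, 1.0, 5)
y = np.linspace(-1.0, 1.0, 5)
a = np.full(5, 0.2)
b = np.full(5, 0.2)
C = (x[:, None] - y[None, :]) ** 2

# Product reference (the usual choice) vs. a reference concentrated
# near the diagonal (a "prior" that mass should not move far).
R_prod = np.outer(a, b)
R_diag = R_prod * np.exp(-5.0 * C)
R_diag /= R_diag.sum()

pi_prod = sinkhorn_with_reference(a, b, C, R_prod)
pi_diag = sinkhorn_with_reference(a, b, C, R_diag)
print(np.sum(pi_prod * C), np.sum(pi_diag * C))
```

In this toy example the diagonal-concentrated reference produces an entropic plan whose transport cost is closer to the unregularized optimum (here zero, since the marginals coincide), in line with the bias-reduction argument in the abstract.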
Related papers
- Non-convex entropic mean-field optimization via Best Response flow
We discuss the problem of minimizing non-convex functionals on the space of probability measures, regularized by the relative entropy (KL divergence) with respect to a fixed reference measure. We show how to choose the regularizer, given the non-convex functional, so that the Best Response map becomes a contraction with respect to the $L^1$-Wasserstein distance.
arXiv Detail & Related papers (2025-05-28T18:22:08Z)
- Asymptotically Optimal Change Detection for Unnormalized Pre- and Post-Change Distributions
This paper addresses the problem of detecting changes when only unnormalized pre- and post-change distributions are accessible. Our approach is based on estimating the Cumulative Sum (CUSUM) statistic, which is known to produce optimal performance.
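For context, the classical (normalized) CUSUM recursion that the summary refers to is simple to state. The sketch below is a generic illustration with made-up Gaussian pre- and post-change models, not the paper's method for unnormalized distributions:

```python
import numpy as np

def cusum(samples, llr):
    """Classical CUSUM recursion: S_t = max(0, S_{t-1} + llr(x_t)),
    where llr is the log-likelihood ratio log(p_post / p_pre).
    A change is declared when S_t crosses a chosen threshold."""
    s, path = 0.0, []
    for x in samples:
        s = max(0.0, s + llr(x))
        path.append(s)
    return np.array(path)

# Pre-change N(0,1), post-change N(1,1): llr(x) = x - 0.5.
rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(0, 1, 100), rng.normal(1, 1, 100)])
stat = cusum(data, lambda x: x - 0.5)
print(int(np.argmax(stat > 10)))  # first index crossing threshold 10
```

Before the change point the statistic hovers near zero (the drift of the log-likelihood ratio is negative); after it, the positive drift makes the statistic grow roughly linearly until it crosses the threshold.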
arXiv Detail & Related papers (2024-10-18T17:13:29Z)
- Conditional Optimal Transport on Function Spaces
We develop a theory of constrained optimal transport problems that describe block-triangular Monge maps.
This generalizes the theory of optimal triangular transport to separable infinite-dimensional function spaces with general cost functions.
We present numerical experiments that demonstrate the computational applicability of our theoretical results for amortized and likelihood-free inference of functional parameters.
arXiv Detail & Related papers (2023-11-09T18:44:42Z)
- On a Relation Between the Rate-Distortion Function and Optimal Transport
We show that a function defined via an extremal entropic OT distance is equivalent to the rate-distortion function.
We numerically verify this result as well as previous results that connect the Monge and Kantorovich problems to optimal scalar quantization.
arXiv Detail & Related papers (2023-07-01T06:20:23Z)
- Kernel-based off-policy estimation without overlap: Instance optimality beyond semiparametric efficiency
We study optimal procedures for estimating a linear functional based on observational data.
For any convex and symmetric function class $\mathcal{F}$, we derive a non-asymptotic local minimax bound on the mean-squared error.
arXiv Detail & Related papers (2023-01-16T02:57:37Z)
- An improved central limit theorem and fast convergence rates for entropic transportation costs
We prove a central limit theorem for the entropic transportation cost between subgaussian probability measures.
We complement these results with new, faster, convergence rates for the expected entropic transportation cost between empirical measures.
arXiv Detail & Related papers (2022-04-19T19:26:59Z)
- Nearly Tight Convergence Bounds for Semi-discrete Entropic Optimal Transport
We derive nearly tight and non-asymptotic convergence bounds for solutions of entropic semi-discrete optimal transport.
Our results also entail a non-asymptotic and tight expansion of the difference between the entropic and the unregularized costs.
arXiv Detail & Related papers (2021-10-25T06:52:45Z)
- Lifting the Convex Conjugate in Lagrangian Relaxations: A Tractable Approach for Continuous Markov Random Fields
We show that a piecewise discretization preserves contrast better than existing discretization approaches.
We apply this theory to the problem of matching two images.
arXiv Detail & Related papers (2021-07-13T12:31:06Z)
- Variational Refinement for Importance Sampling Using the Forward Kullback-Leibler Divergence
Variational Inference (VI) is a popular alternative to exact sampling in Bayesian inference.
Importance sampling (IS) is often used to fine-tune and de-bias the estimates of approximate Bayesian inference procedures.
We propose a novel combination of optimization and sampling techniques for approximate Bayesian inference.
arXiv Detail & Related papers (2021-06-30T11:00:24Z)
- Semi-Discrete Optimal Transport: Hardness, Regularization and Numerical Solution
We prove that computing the Wasserstein distance between a continuous probability measure and a discrete measure supported on only two points is already #P-hard.
We introduce a distributionally robust dual optimal transport problem whose objective function is smoothed with the most adverse disturbance distributions.
We show that smoothing the dual objective function is equivalent to regularizing the primal objective function.
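The dual-smoothing/primal-regularization correspondence mentioned above also has a well-known entropic instance, stated here for illustration (the paper itself smooths with more general adverse disturbance distributions). In the semi-discrete setting with discrete target $\nu = \sum_j \nu_j \delta_{y_j}$, replacing the hard $c$-transform by a log-sum-exp soft-min smooths the dual objective and is equivalent to adding an entropic penalty to the primal:

```latex
\sup_{\varphi \in \mathbb{R}^m}
  \int \varphi^{c,\varepsilon}(x)\,\mathrm{d}\mu(x)
  + \sum_{j=1}^{m} \varphi_j \nu_j,
\qquad
\varphi^{c,\varepsilon}(x)
  = -\varepsilon \log \sum_{j=1}^{m} \nu_j
    \exp\!\Big(\frac{\varphi_j - c(x, y_j)}{\varepsilon}\Big).
```

As $\varepsilon \to 0$ the soft-min recovers the hard $c$-transform $\min_j \big(c(x, y_j) - \varphi_j\big)$, and hence the unregularized dual.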
arXiv Detail & Related papers (2021-03-10T18:53:59Z)
- Distributed Averaging Methods for Randomized Second Order Optimization
We consider distributed optimization problems where forming the Hessian is computationally challenging and communication is a bottleneck.
We develop unbiased parameter averaging methods for randomized second order optimization that employ sampling and sketching of the Hessian.
We also extend the framework of second order averaging methods to introduce an unbiased distributed optimization framework for heterogeneous computing systems.
arXiv Detail & Related papers (2020-02-16T09:01:18Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the accuracy of the information presented and is not responsible for any consequences of its use.