The statistical effect of entropic regularization in optimal
transportation
- URL: http://arxiv.org/abs/2006.05199v2
- Date: Mon, 15 Jun 2020 12:49:28 GMT
- Title: The statistical effect of entropic regularization in optimal
transportation
- Authors: Eustasio del Barrio and Jean-Michel Loubes
- Abstract summary: We provide a closed form for the regularized optimal transport which enables a better understanding of the effect of the regularization from a statistical perspective.
- Score: 6.269377544160702
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose to tackle the problem of understanding the effect of
regularization in Sinkhorn algorithms. In the case of Gaussian distributions we
provide a closed form for the regularized optimal transport, which enables a
better understanding of the effect of the regularization from a statistical
perspective.
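To make the role of the regularization parameter concrete, the sketch below discretizes two one-dimensional Gaussians, runs plain Sinkhorn iterations for several values of the regularization parameter, and compares the resulting transport cost with the closed-form squared Wasserstein-2 distance between the two Gaussians. This is a minimal illustration written for this summary, not the authors' code; the grid, the Gaussian parameters and the eps values are arbitrary choices.

```python
# Minimal illustration (not the authors' code): entropic OT between two
# discretized 1-D Gaussians via plain Sinkhorn iterations, for several values
# of the regularization parameter eps.
import numpy as np

# Discretize N(0, 1) and N(2, 1.5^2) on a common grid and normalize to histograms.
x = np.linspace(-6.0, 8.0, 400)
a = np.exp(-0.5 * ((x - 0.0) / 1.0) ** 2)
b = np.exp(-0.5 * ((x - 2.0) / 1.5) ** 2)
a, b = a / a.sum(), b / b.sum()

# Squared-distance ground cost.
C = (x[:, None] - x[None, :]) ** 2

def sinkhorn_cost(a, b, C, eps, n_iter=2000):
    """Transport part <pi, C> of the plan solving min <pi, C> + eps * KL(pi || a x b)."""
    K = np.exp(-C / eps)
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)
        u = a / (K @ v)
    pi = u[:, None] * K * v[None, :]
    return float(np.sum(pi * C))

for eps in (0.05, 0.5, 5.0):
    print(f"eps = {eps:4}: regularized transport cost = {sinkhorn_cost(a, b, C, eps):.3f}")

# Closed-form squared Wasserstein-2 distance between the two Gaussians:
# (m1 - m2)^2 + (s1 - s2)^2 = 4.25 in this example.
print("unregularized W2^2 =", (0.0 - 2.0) ** 2 + (1.0 - 1.5) ** 2)
```

In this formulation, larger values of eps pull the optimal plan toward the independent coupling, so the transport cost moves away from the unregularized Wasserstein value; as eps shrinks it approaches the closed-form W2^2, which is the small-regularization regime the statistical analysis is concerned with.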
Related papers
- Optimal Transport Adapter Tuning for Bridging Modality Gaps in Few-Shot Remote Sensing Scene Classification [80.83325513157637]
Few-Shot Remote Sensing Scene Classification (FS-RSSC) presents the challenge of classifying remote sensing images with limited labeled samples.
We propose a novel Optimal Transport Adapter Tuning (OTAT) framework aimed at constructing an ideal Platonic representational space.
arXiv Detail & Related papers (2025-03-19T07:04:24Z)
- Partial Distribution Alignment via Adaptive Optimal Transport [11.167177175327359]
We propose adaptive optimal transport, which differs from classical optimal transport in its ability to adaptively preserve mass.
We instantiate adaptive optimal transport in a machine learning application to align source and target distributions partially and adaptively.
arXiv Detail & Related papers (2025-03-07T02:13:04Z)
- Asymptotically Optimal Change Detection for Unnormalized Pre- and Post-Change Distributions [65.38208224389027]
This paper addresses the problem of detecting changes when only unnormalized pre- and post-change distributions are accessible.
Our approach is based on the estimation of the Cumulative Sum (CUSUM) statistic, which is known to produce optimal performance (a minimal CUSUM sketch appears after this list).
arXiv Detail & Related papers (2024-10-18T17:13:29Z)
- Uncertainty Quantification via Stable Distribution Propagation [60.065272548502]
We propose a new approach for propagating stable probability distributions through neural networks.
Our method is based on local linearization, which we show to be an optimal approximation in terms of total variation distance for the ReLU non-linearity.
arXiv Detail & Related papers (2024-02-13T09:40:19Z)
- Distributed Markov Chain Monte Carlo Sampling based on the Alternating Direction Method of Multipliers [143.6249073384419]
In this paper, we propose a distributed sampling scheme based on the alternating direction method of multipliers.
We provide both theoretical guarantees of our algorithm's convergence and experimental evidence of its superiority to the state-of-the-art.
In simulation, we deploy our algorithm on linear and logistic regression tasks and illustrate its fast convergence compared to existing gradient-based methods.
arXiv Detail & Related papers (2024-01-29T02:08:40Z)
- Conditional Optimal Transport on Function Spaces [53.9025059364831]
We develop a theory of constrained optimal transport problems that describe block-triangular Monge maps.
This generalizes the theory of optimal triangular transport to separable infinite-dimensional function spaces with general cost functions.
We present numerical experiments that demonstrate the computational applicability of our theoretical results for amortized and likelihood-free inference of functional parameters.
arXiv Detail & Related papers (2023-11-09T18:44:42Z)
- Constrained Reweighting of Distributions: an Optimal Transport Approach [8.461214317999321]
We introduce nonparametrically imbued distributional constraints on the weights and develop a general framework leveraging the maximum entropy principle and tools from optimal transport.
The framework is demonstrated in the context of three disparate applications: portfolio allocation, semi-parametric inference for complex surveys, and ensuring algorithmic fairness in machine learning algorithms.
arXiv Detail & Related papers (2023-10-19T03:54:31Z)
- Bayesian Renormalization [68.8204255655161]
We present a fully information-theoretic approach to renormalization inspired by Bayesian statistical inference.
The main insight of Bayesian Renormalization is that the Fisher metric defines a correlation length that plays the role of an emergent RG scale.
We provide insight into how the Bayesian Renormalization scheme relates to existing methods for data compression and data generation.
arXiv Detail & Related papers (2023-05-17T18:00:28Z)
- Convergence rate of Tsallis entropic regularized optimal transport [0.0]
We establish fundamental results such as the $\Gamma$-convergence of the Tsallis regularized optimal transport to the Monge--Kantorovich problem as the regularization parameter tends to zero.
We show that the KL regularization achieves the fastest convergence rate within the Tsallis framework (a schematic form of the regularized problem is given after this list).
arXiv Detail & Related papers (2023-04-13T15:37:14Z)
- New Perspectives on Regularization and Computation in Optimal Transport-Based Distributionally Robust Optimization [8.564319625930892]
We study optimal transport-based distributionally robust optimization problems where a fictitious adversary, often envisioned as nature, can choose the distribution of the uncertain problem parameters by reshaping a prescribed reference distribution at a finite transportation cost (the generic shape of such a problem is sketched after this list).
arXiv Detail & Related papers (2023-03-07T13:52:32Z)
- An improved central limit theorem and fast convergence rates for entropic transportation costs [13.9170193921377]
We prove a central limit theorem for the entropic transportation cost between subgaussian probability measures.
We complement these results with new, faster convergence rates for the expected entropic transportation cost between empirical measures.
arXiv Detail & Related papers (2022-04-19T19:26:59Z)
- Generalization Properties of Stochastic Optimizers via Trajectory Analysis [48.38493838310503]
We show that both the Fernique-Talagrand functional and the local power law are predictive of generalization performance.
arXiv Detail & Related papers (2021-08-02T10:58:32Z)
- Statistical Optimal Transport posed as Learning Kernel Embedding [0.0]
This work takes the novel approach of posing statistical Optimal Transport (OT) as that of learning the transport plan's kernel mean embedding from sample-based estimates of the marginal embeddings.
A key result is that, under very mild conditions, $\epsilon$-optimal recovery of the transport plan as well as the Barycentric-projection based transport map is possible with a sample complexity that is completely dimension-free.
arXiv Detail & Related papers (2020-02-08T14:58:53Z)
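The change-detection entry above refers the reader here. The following is a minimal sketch of the classical CUSUM procedure with fully known pre- and post-change Gaussian densities; the cited paper addresses the harder setting where only unnormalized densities are available, and the densities, change point and threshold below are illustrative assumptions, not values from the paper.

```python
# Minimal sketch (not the paper's method): classical CUSUM change detection
# with known pre- and post-change Gaussian densities.
import numpy as np

rng = np.random.default_rng(0)

def log_likelihood_ratio(x, mu0=0.0, mu1=1.0, sigma=1.0):
    """log p1(x)/p0(x) for two Gaussians sharing the same variance."""
    return ((x - mu0) ** 2 - (x - mu1) ** 2) / (2.0 * sigma ** 2)

def cusum(xs, threshold=10.0):
    """Return the first index at which the CUSUM statistic exceeds the threshold."""
    s = 0.0
    for t, x in enumerate(xs):
        s = max(0.0, s + log_likelihood_ratio(x))  # recursive CUSUM update
        if s > threshold:
            return t
    return None

# 200 pre-change samples from N(0, 1) followed by 200 post-change samples from N(1, 1).
xs = np.concatenate([rng.normal(0.0, 1.0, 200), rng.normal(1.0, 1.0, 200)])
print("alarm raised at index:", cusum(xs))  # typically shortly after t = 200
```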
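For the Tsallis entry above, the display below records one common way of writing the Tsallis entropic regularized problem and its Kullback-Leibler limit; the normalization conventions in the cited paper may differ, so this is a schematic reminder rather than the paper's exact formulation.

```latex
% Tsallis relative entropy of a coupling \pi with respect to the product
% measure a \otimes b (one common convention), for q > 0, q \neq 1:
%   D_q(\pi \,\|\, a \otimes b)
%     = \frac{1}{q-1}\Big( \int \Big(\tfrac{d\pi}{d(a \otimes b)}\Big)^{q-1} d\pi - 1 \Big),
% which converges to the Kullback--Leibler divergence as q \to 1.
% The Tsallis entropic regularized optimal transport problem then reads
\mathrm{OT}_{\varepsilon, q}(a, b) \;=\; \min_{\pi \in \Pi(a, b)} \int c \, d\pi \;+\; \varepsilon \, D_q(\pi \,\|\, a \otimes b),
% and the $\Gamma$-convergence statement concerns the limit \varepsilon \to 0.
```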
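For the distributionally robust optimization entry above, the display below shows the generic shape of an optimal transport-based DRO problem; the symbols ($x$ for the decision, $\ell$ for the loss, $W_c$ for the optimal transport discrepancy with ground cost $c$, $\hat{P}$ for the reference distribution, $r$ for the budget) are notation chosen for this sketch, not taken from the paper.

```latex
% Worst-case expected loss over all distributions Q reachable from the
% reference distribution \hat{P} at transportation cost at most r:
\min_{x \in X} \;\; \sup_{Q \,:\, W_c(Q, \hat{P}) \le r} \;\; \mathbb{E}_{\xi \sim Q}\big[\, \ell(x, \xi) \,\big]
```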
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the information provided and is not responsible for any consequences.