Probabilistic Inverse Optimal Transport
- URL: http://arxiv.org/abs/2112.09754v1
- Date: Fri, 17 Dec 2021 20:33:27 GMT
- Title: Probabilistic Inverse Optimal Transport
- Authors: Wei-Ting Chiu, Pei Wang, Patrick Shafto
- Abstract summary: Optimal transport (OT) formalizes the problem of finding an optimal coupling between probability measures given a cost matrix.
The inverse problem of inferring the cost given a coupling is Inverse Optimal Transport (IOT).
We formalize and systematically analyze the properties of IOT using tools from the study of entropy-regularized OT.
- Score: 11.425633112192521
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Optimal transport (OT) formalizes the problem of finding an optimal coupling
between probability measures given a cost matrix. The inverse problem of
inferring the cost given a coupling is Inverse Optimal Transport (IOT). IOT is
less well understood than OT. We formalize and systematically analyze the
properties of IOT using tools from the study of entropy-regularized OT.
Theoretical contributions include characterization of the manifold of
cross-ratio equivalent costs, the implications of model priors, and derivation
of an MCMC sampler. Empirical contributions include visualizations of
cross-ratio equivalent effect on basic examples and simulations validating
theoretical results.
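The abstract's framing can be made concrete with a small sketch (assumed here for illustration, not the authors' code; the function name `sinkhorn`, the regularization parameter `reg`, and the toy problem sizes are all placeholders). It solves entropy-regularized OT by Sinkhorn iterations and then checks the cross-ratio equivalence the paper studies: costs that differ only by row and column offsets, C'_ij = C_ij + u_i + v_j, induce the same entropic coupling, so an observed coupling can identify the cost only up to this equivalence class.

```python
# Minimal sketch (assumption, not the paper's implementation): entropy-regularized
# OT via Sinkhorn iterations, plus a check that "cross-ratio equivalent" costs --
# costs differing only by row/column offsets -- yield the same coupling.
import numpy as np

def sinkhorn(C, mu, nu, reg=0.1, n_iter=2000):
    """Return the entropic-OT coupling P = diag(a) K diag(b), K = exp(-C/reg),
    whose row sums match mu and column sums match nu."""
    K = np.exp(-C / reg)                      # Gibbs kernel
    a = np.ones_like(mu)
    for _ in range(n_iter):
        b = nu / (K.T @ a)                    # scale columns to match nu
        a = mu / (K @ b)                      # scale rows to match mu
    return a[:, None] * K * b[None, :]

rng = np.random.default_rng(0)
m, n = 5, 4
C = rng.uniform(size=(m, n))                  # some ground cost
mu = np.full(m, 1.0 / m)                      # uniform marginals for simplicity
nu = np.full(n, 1.0 / n)

# A cross-ratio equivalent cost: add arbitrary row offsets u_i and column offsets v_j.
u, v = rng.normal(size=m), rng.normal(size=n)
C_equiv = C + u[:, None] + v[None, :]

P = sinkhorn(C, mu, nu)
P_equiv = sinkhorn(C_equiv, mu, nu)
print(np.abs(P - P_equiv).max())              # ~0: same coupling from different costs
```

This non-identifiability is why, per the abstract, IOT has to reason over the whole manifold of cross-ratio equivalent costs, for instance through model priors and the MCMC sampler the paper derives.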
Related papers
- Asymptotically Optimal Change Detection for Unnormalized Pre- and Post-Change Distributions [65.38208224389027]
This paper addresses the problem of detecting changes when only unnormalized pre- and post-change distributions are accessible.
Our approach is based on estimating the Cumulative Sum (CUSUM) statistic, which is known to produce optimal performance.
arXiv Detail & Related papers (2024-10-18T17:13:29Z)
- Unveiling the Statistical Foundations of Chain-of-Thought Prompting Methods [59.779795063072655]
Chain-of-Thought (CoT) prompting and its variants have gained popularity as effective methods for solving multi-step reasoning problems.
We analyze CoT prompting from a statistical estimation perspective, providing a comprehensive characterization of its sample complexity.
arXiv Detail & Related papers (2024-08-25T04:07:18Z)
- Lower Complexity Adaptation for Empirical Entropic Optimal Transport [0.0]
Entropic optimal transport (EOT) presents an effective and computationally viable alternative to unregularized optimal transport (OT).
We derive novel statistical bounds for empirical plug-in estimators of the EOT cost.
Our techniques employ empirical process theory and rely on a dual formulation of EOT over a single function class.
arXiv Detail & Related papers (2023-06-23T16:06:13Z)
- Robust computation of optimal transport by $\beta$-potential regularization [79.24513412588745]
Optimal transport (OT) has become a widely used tool in the machine learning field to measure the discrepancy between probability distributions.
We propose regularizing OT with the $\beta$-potential term associated with the so-called $\beta$-divergence.
We experimentally demonstrate that the transport matrix computed with our algorithm helps estimate a probability distribution robustly even in the presence of outliers.
arXiv Detail & Related papers (2022-12-26T18:37:28Z)
- On the potential benefits of entropic regularization for smoothing Wasserstein estimators [3.194167428464958]
This paper focuses on the study of entropic regularization in optimal transport as a smoothing method for Wasserstein estimators.
We discuss how entropic regularization may reach, at a lower computational cost, statistical performances comparable to those of un-regularized Wasserstein estimators.
arXiv Detail & Related papers (2022-10-13T12:04:36Z)
- Learning Optimal Transport Between two Empirical Distributions with Normalizing Flows [12.91637880428221]
We propose to leverage the flexibility of neural networks to learn an approximate optimal transport map.
We show that a particular instance of invertible neural networks, namely the normalizing flows, can be used to approximate the solution of this OT problem.
arXiv Detail & Related papers (2022-07-04T08:08:47Z)
- Low-rank Optimal Transport: Approximation, Statistics and Debiasing [51.50788603386766]
The low-rank optimal transport (LOT) approach was advocated in Scetbon et al. (2021).
LOT is seen as a legitimate contender to entropic regularization when compared on properties of interest.
We target each of these areas in this paper in order to cement the impact of low-rank approaches in computational OT.
arXiv Detail & Related papers (2022-05-24T20:51:37Z)
- Pseudo-Spherical Contrastive Divergence [119.28384561517292]
We propose pseudo-spherical contrastive divergence (PS-CD) to generalize maximum likelihood learning of energy-based models.
PS-CD avoids the intractable partition function and provides a generalized family of learning objectives.
arXiv Detail & Related papers (2021-11-01T09:17:15Z)
- Multi-task learning on the edge: cost-efficiency and theoretical optimality [0.0]
This article proposes a distributed multi-task learning (MTL) algorithm based on supervised principal component analysis (SPCA).
Supporting experiments on synthetic and real benchmark data demonstrate that significant energy gains can be obtained with no performance loss.
arXiv Detail & Related papers (2021-10-09T19:59:02Z)
- Comparing Probability Distributions with Conditional Transport [63.11403041984197]
We propose conditional transport (CT) as a new divergence and approximate it with the amortized CT (ACT) cost.
ACT amortizes the computation of its conditional transport plans and comes with unbiased sample gradients that are straightforward to compute.
On a wide variety of generative modeling benchmark datasets, substituting the default statistical distance of an existing generative adversarial network with ACT is shown to consistently improve performance.
arXiv Detail & Related papers (2020-12-28T05:14:22Z)
- Statistical Optimal Transport posed as Learning Kernel Embedding [0.0]
This work takes the novel approach of posing statistical Optimal Transport (OT) as that of learning the transport plan's kernel mean embedding from sample based estimates of marginal embeddings.
A key result is that, under very mild conditions, $\epsilon$-optimal recovery of the transport plan as well as the barycentric-projection based transport map is possible with a sample complexity that is completely dimension-free.
arXiv Detail & Related papers (2020-02-08T14:58:53Z)