CO-Optimal Transport
- URL: http://arxiv.org/abs/2002.03731v3
- Date: Fri, 6 Nov 2020 14:31:21 GMT
- Title: CO-Optimal Transport
- Authors: Ievgen Redko, Titouan Vayer, Rémi Flamary, Nicolas Courty
- Abstract summary: Optimal transport (OT) is a powerful tool for finding correspondences and measuring similarity between two distributions.
We propose a novel OT problem, named COOT for CO-Optimal Transport, that simultaneously optimizes two transport maps between both samples and features.
We demonstrate its versatility with two machine learning applications in heterogeneous domain adaptation and co-clustering/data summarization.
- Score: 19.267807479856575
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Optimal transport (OT) is a powerful geometric and probabilistic tool for
finding correspondences and measuring similarity between two distributions.
Yet, its original formulation relies on the existence of a cost function
between the samples of the two distributions, which makes it impractical when
they are supported on different spaces. To circumvent this limitation, we
propose a novel OT problem, named COOT for CO-Optimal Transport, that
simultaneously optimizes two transport maps between both samples and features,
contrary to other approaches that either discard the individual features by
focusing on pairwise distances between samples or need to model explicitly the
relations between them. We provide a thorough theoretical analysis of our
problem, establish its rich connections with other OT-based distances and
demonstrate its versatility with two machine learning applications in
heterogeneous domain adaptation and co-clustering/data summarization, where
COOT leads to performance improvements over the state-of-the-art methods.
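The alternating structure the abstract describes can be sketched in plain NumPy: with the feature coupling held fixed, the sample problem becomes a standard OT problem (and vice versa), so block-coordinate descent alternates two Sinkhorn solves. This is a minimal illustration under assumed choices (squared loss, uniform marginals, entropic regularization), not the authors' reference implementation; all function names and hyperparameters here are illustrative.

```python
import numpy as np

def sinkhorn(cost, a, b, reg=0.1, n_iter=300):
    # Entropic OT: returns a coupling whose row sums equal a exactly after
    # the final u-update and whose column sums approach b as it converges.
    K = np.exp(-(cost - cost.min()) / reg)  # shift for numerical stability
    u = np.ones_like(a)
    v = np.ones_like(b)
    for _ in range(n_iter):
        v = b / (K.T @ u)
        u = a / (K @ v)
    return u[:, None] * K * v[None, :]

def coot(X, Xp, reg=0.1, n_outer=20):
    # Block-coordinate descent for COOT with squared loss: alternately
    # solve an OT problem over samples (rows) and one over features
    # (columns), each with the other coupling held fixed.
    n, d = X.shape
    m, e = Xp.shape
    ws, wsp = np.full(n, 1.0 / n), np.full(m, 1.0 / m)   # sample weights
    wv, wvp = np.full(d, 1.0 / d), np.full(e, 1.0 / e)   # feature weights
    pi_v = np.outer(wv, wvp)  # independent init for the feature coupling
    for _ in range(n_outer):
        # Cs[i, j] = sum_{k,l} (X[i,k] - Xp[j,l])^2 * pi_v[k,l],
        # expanded so it costs three matrix products instead of a 4-loop.
        Cs = ((X ** 2) @ pi_v.sum(1))[:, None] \
             + ((Xp ** 2) @ pi_v.sum(0))[None, :] \
             - 2.0 * X @ pi_v @ Xp.T
        pi_s = sinkhorn(Cs, ws, wsp, reg)
        # Cv[k, l] = sum_{i,j} (X[i,k] - Xp[j,l])^2 * pi_s[i,j]
        Cv = ((X ** 2).T @ pi_s.sum(1))[:, None] \
             + ((Xp ** 2).T @ pi_s.sum(0))[None, :] \
             - 2.0 * X.T @ pi_s @ Xp
        pi_v = sinkhorn(Cv, wv, wvp, reg)
    obj = float((Cv * pi_v).sum())  # COOT cost at the last iterate
    return pi_s, pi_v, obj
```

Here `pi_s` aligns samples across the two datasets while `pi_v` aligns their features; in heterogeneous domain adaptation this is what lets labels be transported even when the two feature spaces differ in dimension.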
Related papers
- Optimal Transport-Guided Conditional Score-Based Diffusion Models [63.14903268958398]
Conditional score-based diffusion models (SBDMs) generate target data conditioned on paired data and have achieved great success in image translation.
To tackle the applications with partially paired or even unpaired dataset, we propose a novel Optimal Transport-guided Conditional Score-based diffusion model (OTCS) in this paper.
arXiv Detail & Related papers (2023-11-02T13:28:44Z)
- Energy-Guided Continuous Entropic Barycenter Estimation for General Costs [95.33926437521046]
We propose a novel algorithm for approximating the continuous Entropic OT (EOT) barycenter for arbitrary OT cost functions.
Our approach is built upon the dual reformulation of the EOT problem based on weak OT.
arXiv Detail & Related papers (2023-10-02T11:24:36Z)
- Unifying Distributionally Robust Optimization via Optimal Transport Theory [13.19058156672392]
This paper introduces a novel approach that unifies these methods into a single framework based on optimal transport.
Our proposed approach makes it possible for optimal adversarial distributions to simultaneously perturb likelihood and outcomes.
The paper investigates several duality results and presents tractable reformulations that enhance the practical applicability of this unified framework.
arXiv Detail & Related papers (2023-08-10T08:17:55Z)
- Generative Modeling through the Semi-dual Formulation of Unbalanced Optimal Transport [9.980822222343921]
We propose a novel generative model based on the semi-dual formulation of Unbalanced Optimal Transport (UOT).
Unlike OT, UOT relaxes the hard constraint on distribution matching. This approach provides better robustness against outliers, stability during training, and faster convergence.
Our model outperforms existing OT-based generative models, achieving FID scores of 2.97 on CIFAR-10 and 6.36 on CelebA-HQ-256.
arXiv Detail & Related papers (2023-05-24T06:31:05Z)
- Learning Optimal Transport Between two Empirical Distributions with Normalizing Flows [12.91637880428221]
We propose to leverage the flexibility of neural networks to learn an approximate optimal transport map.
We show that a particular instance of invertible neural networks, namely the normalizing flows, can be used to approximate the solution of this OT problem.
arXiv Detail & Related papers (2022-07-04T08:08:47Z)
- Unbalanced CO-Optimal Transport [16.9451175221198]
CO-optimal transport (COOT) extends the sample-level comparison of classical OT by inferring an alignment between features as well.
We show that COOT is sensitive to outliers, which are omnipresent in real-world data.
This prompts us to propose unbalanced COOT, for which we prove robustness to noise.
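The standard unbalanced-OT device replaces the hard marginal constraints with divergence penalties; applied to the COOT objective it takes a form like the following sketch (assumed notation: squared loss, KL penalties with weights τ on the marginals of both couplings — the paper's exact formulation may differ):

```latex
\min_{\pi^s \ge 0,\; \pi^v \ge 0}\;
  \sum_{i,j,k,l} \bigl(X_{ik} - X'_{jl}\bigr)^2 \pi^s_{ij} \pi^v_{kl}
  + \tau_1\,\mathrm{KL}\!\bigl(\pi^s \mathbf{1} \,\big\|\, w\bigr)
  + \tau_2\,\mathrm{KL}\!\bigl((\pi^s)^{\top} \mathbf{1} \,\big\|\, w'\bigr)
  + \tau_3\,\mathrm{KL}\!\bigl(\pi^v \mathbf{1} \,\big\|\, v\bigr)
  + \tau_4\,\mathrm{KL}\!\bigl((\pi^v)^{\top} \mathbf{1} \,\big\|\, v'\bigr)
```

Relaxing the constraints lets the couplings discard mass on outlier samples or features, which is the intuition behind the robustness claimed above.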
arXiv Detail & Related papers (2022-05-30T08:43:19Z)
- Comparing Probability Distributions with Conditional Transport [63.11403041984197]
We propose conditional transport (CT) as a new divergence and approximate it with the amortized CT (ACT) cost.
ACT amortizes the computation of its conditional transport plans and comes with unbiased sample gradients that are straightforward to compute.
On a wide variety of generative-modeling benchmark datasets, substituting the default statistical distance of an existing generative adversarial network with ACT is shown to consistently improve performance.
arXiv Detail & Related papers (2020-12-28T05:14:22Z)
- Robust Optimal Transport with Applications in Generative Modeling and Domain Adaptation [120.69747175899421]
Optimal Transport (OT) distances such as Wasserstein have been used in several areas such as GANs and domain adaptation.
We propose a computationally-efficient dual form of the robust OT optimization that is amenable to modern deep learning applications.
Our approach can train state-of-the-art GAN models on noisy datasets corrupted with outlier distributions.
arXiv Detail & Related papers (2020-10-12T17:13:40Z)
- Hierarchical Optimal Transport for Robust Multi-View Learning [97.21355697826345]
Two common assumptions of multi-view learning may be questionable in practice, which limits its application.
We propose a hierarchical optimal transport (HOT) method to mitigate the dependency on these two assumptions.
The HOT method is applicable to both unsupervised and semi-supervised learning, and experimental results show that it performs robustly on both synthetic and real-world tasks.
arXiv Detail & Related papers (2020-06-04T22:24:45Z)
- Joint Wasserstein Distribution Matching [89.86721884036021]
The joint distribution matching (JDM) problem, which aims to learn bidirectional mappings that match the joint distributions of two domains, arises in many machine learning and computer vision applications.
We propose to address JDM problem by minimizing the Wasserstein distance of the joint distributions in two domains.
We then prove a theorem that reduces this intractable problem to a simple optimization problem, and develop a novel method to solve it.
arXiv Detail & Related papers (2020-03-01T03:39:00Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.