A contribution to Optimal Transport on incomparable spaces
- URL: http://arxiv.org/abs/2011.04447v1
- Date: Mon, 9 Nov 2020 14:13:52 GMT
- Title: A contribution to Optimal Transport on incomparable spaces
- Authors: Titouan Vayer
- Abstract summary: This thesis studies the complex scenario in which data belong to incomparable spaces.
It proposes a set of Optimal Transport tools for these different cases.
- Score: 4.873362301533825
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Optimal Transport is a theory that makes it possible to define geometrical
notions of distance between probability distributions and to find correspondences
between sets of points. Many machine learning applications, at the frontier between
mathematics and optimization, are derived from this theory. This thesis studies the
complex scenario in which the data belong to incomparable spaces. In particular, we
address the following questions: how can Optimal Transport be defined and applied
between graphs, or between structured data more generally? How can it be adapted
when the data are heterogeneous and not embedded in the same metric space? This
thesis proposes a set of Optimal Transport tools for these different cases. An
important part is devoted to the study of the Gromov-Wasserstein distance, whose
properties make it possible to define interesting transport problems on incomparable
spaces. More broadly, we analyze the mathematical properties of the various proposed
tools, establish algorithmic solutions to compute them, and study their applicability
in numerous machine learning scenarios covering, in particular, classification,
simplification, and partitioning of structured data, as well as heterogeneous
domain adaptation.
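The Gromov-Wasserstein distance mentioned in the abstract compares spaces through their internal distance matrices rather than a shared ground cost, which is what makes it applicable when the two spaces are incomparable. As a hedged illustration (a minimal NumPy sketch, not the thesis's own code), the GW objective for a fixed coupling T can be evaluated directly from its definition:

```python
import numpy as np

def gw_objective(C1, C2, T):
    """Gromov-Wasserstein objective for a fixed coupling T:
    sum over i,j,k,l of (C1[i,k] - C2[j,l])^2 * T[i,j] * T[k,l]."""
    n, m = T.shape
    total = 0.0
    for i in range(n):
        for j in range(m):
            for k in range(n):
                for l in range(m):
                    total += (C1[i, k] - C2[j, l]) ** 2 * T[i, j] * T[k, l]
    return total

# Two copies of the same 3-point metric space on the real line.
X = np.array([[0.0], [1.0], [3.0]])
C = np.abs(X - X.T)           # pairwise distance matrix (shared by both copies)
T = np.eye(3) / 3.0           # identity coupling with uniform marginals
print(gw_objective(C, C, T))  # → 0.0 (isometric spaces, matching coupling)
```

The brute-force quadruple loop is only meant to make the definition explicit; practical solvers factorize this computation and optimize over T rather than evaluating it at a fixed coupling.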
Related papers
- Geometry of the Space of Partitioned Networks: A Unified Theoretical and Computational Framework [3.69102525133732]
"Space of networks" has a complex structure that cannot be adequately described using conventional statistical tools.
We introduce a measure-theoretic formalism for modeling generalized network structures such as graphs, hypergraphs, or graphs whose nodes come with a partition into categorical classes.
We show that the resulting metric space is an Alexandrov space of non-negative curvature, and leverage this structure to define gradients for certain functionals commonly arising in geometric data analysis tasks.
arXiv Detail & Related papers (2024-09-10T07:58:37Z) - Linear optimal transport subspaces for point set classification [12.718843888673227]
We propose a framework for classifying point sets experiencing certain types of spatial deformations.
Our approach employs the Linear Optimal Transport (LOT) transform to obtain a linear embedding of set-structured data.
It achieves competitive accuracies compared to state-of-the-art methods across various point set classification tasks.
arXiv Detail & Related papers (2024-03-15T04:39:27Z) - Improving embedding of graphs with missing data by soft manifolds [51.425411400683565]
The reliability of graph embeddings depends on how much the geometry of the continuous space matches the graph structure.
We introduce a new class of manifolds, named soft manifolds, that can handle this situation.
Using soft manifold for graph embedding, we can provide continuous spaces to pursue any task in data analysis over complex datasets.
arXiv Detail & Related papers (2023-11-29T12:48:33Z) - InfoOT: Information Maximizing Optimal Transport [58.72713603244467]
InfoOT is an information-theoretic extension of optimal transport.
It maximizes the mutual information between domains while minimizing geometric distances.
This formulation yields a new projection method that is robust to outliers and generalizes to unseen samples.
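To make "mutual information between domains" concrete: a transport plan can be read as a joint distribution over source and target samples, and its mutual information measured against the product of its marginals. The sketch below is an illustrative NumPy computation of that quantity for a given plan, not InfoOT's actual estimator:

```python
import numpy as np

def plan_mutual_information(T):
    """Mutual information of a coupling T, viewed as a joint distribution
    over (source, target) with marginals a = T.sum(1) and b = T.sum(0)."""
    a = T.sum(axis=1)
    b = T.sum(axis=0)
    mask = T > 0  # skip zero entries, where T * log(...) contributes 0
    return float((T[mask] * np.log(T[mask] / np.outer(a, b)[mask])).sum())

a = np.full(4, 0.25)
# An independent coupling carries zero information about the correspondence...
print(plan_mutual_information(np.outer(a, a)))  # → 0.0
# ...while a deterministic (diagonal) coupling is maximally informative.
print(plan_mutual_information(np.diag(a)))      # → log(4) ≈ 1.386
```

Intuitively, favoring high-mutual-information plans pushes the coupling toward deterministic, structure-preserving matchings rather than diffuse ones.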
arXiv Detail & Related papers (2022-10-06T18:55:41Z) - On a class of geodesically convex optimization problems solved via
Euclidean MM methods [50.428784381385164]
We show how a difference of Euclidean convex functions decomposition can be obtained for different types of problems in statistics and machine learning.
Ultimately, we hope this work benefits the broader community.
arXiv Detail & Related papers (2022-06-22T23:57:40Z) - Measuring dissimilarity with diffeomorphism invariance [94.02751799024684]
We introduce DID, a pairwise dissimilarity measure applicable to a wide range of data spaces.
We prove that DID enjoys properties which make it relevant for theoretical study and practical use.
arXiv Detail & Related papers (2022-02-11T13:51:30Z) - Switch Spaces: Learning Product Spaces with Sparse Gating [48.591045282317424]
We propose Switch Spaces, a data-driven approach for learning representations in product space.
We introduce sparse gating mechanisms that learn to choose, combine and switch spaces.
Experiments on knowledge graph completion and item recommendations show that the proposed switch space achieves new state-of-the-art performances.
arXiv Detail & Related papers (2021-02-17T11:06:59Z) - A Trainable Optimal Transport Embedding for Feature Aggregation and its
Relationship to Attention [96.77554122595578]
We introduce a parametrized representation of fixed size, which embeds and then aggregates elements from a given input set according to the optimal transport plan between the set and a trainable reference.
Our approach scales to large datasets and allows end-to-end training of the reference, while also providing a simple unsupervised learning mechanism with small computational cost.
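The aggregation step described above can be sketched as pooling the input elements with a transport plan toward a small reference set, yielding a fixed-size representation. In the sketch below the reference is random rather than trained, and the plan comes from plain Sinkhorn iterations; both are simplifying assumptions, not the paper's exact implementation:

```python
import numpy as np

def sinkhorn(a, b, M, reg=0.1, n_iter=200):
    """Classic Sinkhorn iterations for an entropic OT plan
    between histograms a, b under cost matrix M."""
    K = np.exp(-M / reg)
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)
        u = a / (K @ v)
    return u[:, None] * K * v[None, :]

rng = np.random.default_rng(0)
X = rng.normal(size=(10, 4))  # input set: 10 elements of R^4
Z = rng.normal(size=(3, 4))   # reference set: 3 prototypes (random, not trained)
M = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)  # squared-distance cost
M /= M.max()                  # normalize the cost for numerical stability
a = np.full(10, 0.1)
b = np.full(3, 1 / 3)
T = sinkhorn(a, b, M)         # coupling of shape (10, 3)
pooled = (T.T @ X) / b[:, None]  # one pooled vector per reference prototype
print(pooled.shape)           # → (3, 4)
```

However many elements the input set contains, the pooled output always has one row per prototype, which is what makes the embedding usable as a fixed-size feature for downstream models.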
arXiv Detail & Related papers (2020-06-22T08:35:58Z) - Geometric Dataset Distances via Optimal Transport [15.153110906331733]
We propose an alternative notion of distance between datasets that (i) is model-agnostic, (ii) does not involve training, (iii) can compare datasets even if their label sets are completely disjoint and (iv) has solid theoretical footing.
This distance relies on optimal transport, which provides it with rich geometry awareness, interpretable correspondences and well-understood properties.
Our results show that this novel distance provides meaningful comparison of datasets, and correlates well with transfer learning hardness across various experimental settings and datasets.
arXiv Detail & Related papers (2020-02-07T17:51:26Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it presents and is not responsible for any consequences of its use.