Partial Gromov-Wasserstein Metric
- URL: http://arxiv.org/abs/2402.03664v3
- Date: Thu, 26 Sep 2024 00:56:20 GMT
- Title: Partial Gromov-Wasserstein Metric
- Authors: Yikun Bai, Rocio Diaz Martin, Abihith Kothapalli, Hengrong Du, Xinran Liu, Soheil Kolouri
- Abstract summary: The Gromov-Wasserstein (GW) distance has gained increasing interest in the machine learning community in recent years.
We propose a particular case of the Unbalanced GW (UGW) problem, termed Partial Gromov-Wasserstein (PGW).
- Score: 8.503892585556901
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The Gromov-Wasserstein (GW) distance has gained increasing interest in the machine learning community in recent years, as it allows for the comparison of measures in different metric spaces. To overcome the limitations imposed by the equal mass requirements of the classical GW problem, researchers have begun exploring its application in unbalanced settings. However, Unbalanced GW (UGW) can only be regarded as a discrepancy rather than a rigorous metric/distance between two metric measure spaces (mm-spaces). In this paper, we propose a particular case of the UGW problem, termed Partial Gromov-Wasserstein (PGW). We establish that PGW is a well-defined metric between mm-spaces and discuss its theoretical properties, including the existence of a minimizer for the PGW problem and the relationship between PGW and GW, among others. We then propose two variants of the Frank-Wolfe algorithm for solving the PGW problem and show that they are mathematically and computationally equivalent. Moreover, based on our PGW metric, we introduce the analogous concept of barycenters for mm-spaces. Finally, we validate the effectiveness of our PGW metric and related solvers in applications such as shape matching, shape retrieval, and shape interpolation, comparing them against existing baselines.
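As a concrete illustration of the setting, the sketch below computes a partial Gromov-Wasserstein coupling between two point clouds that live in different ambient spaces, transporting only a prescribed fraction of the total mass. It assumes the POT library exposes a partial GW solver as `ot.partial.partial_gromov_wasserstein` (as in recent POT releases); that solver implements an earlier partial GW formulation, so this is only a sketch of the problem setting, not the paper's PGW metric or its Frank-Wolfe variants.

```python
# Minimal sketch: a partial Gromov-Wasserstein coupling between two point clouds
# in different ambient spaces. Assumes POT provides ot.partial.partial_gromov_wasserstein;
# the function name/location may vary across POT versions, and this is not the
# paper's own PGW solver.
import numpy as np
import ot  # Python Optimal Transport (POT)

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 2))              # source samples in R^2
Y = rng.normal(size=(50, 3))              # target samples in R^3

C1 = ot.dist(X, X)                        # intra-source pairwise (squared Euclidean) costs
C2 = ot.dist(Y, Y)                        # intra-target pairwise costs
p = np.full(40, 1.0 / 40)                 # uniform source masses (total 1)
q = np.full(50, 1.0 / 50)                 # uniform target masses (total 1)

m = 0.7                                   # total mass to transport, m <= min(p.sum(), q.sum())
G = ot.partial.partial_gromov_wasserstein(C1, C2, p, q, m=m)

print("transported mass:", G.sum())       # should be close to m
print("marginals dominated by p and q:",
      bool((G.sum(1) <= p + 1e-10).all() and (G.sum(0) <= q + 1e-10).all()))
```

The partial coupling only needs its marginals to be dominated by p and q while carrying total mass m, which is what the final check verifies.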
Related papers
- Metric properties of partial and robust Gromov-Wasserstein distances [3.9485589956945204]
The Gromov-Wasserstein (GW) distances define a family of metrics, based on ideas from optimal transport.
GW distances are inherently sensitive to outlier noise and cannot accommodate partial matching.
We show that our new distances define true metrics, that they induce the same topology as the GW distances, and that they enjoy additional robustness to perturbations.
arXiv Detail & Related papers (2024-11-04T15:53:45Z)
- GWQ: Gradient-Aware Weight Quantization for Large Language Models [61.17678373122165]
Gradient-aware weight quantization (GWQ) is the first low-bit weight quantization approach that leverages gradients to localize outliers.
GWQ preferentially retains the weights corresponding to the top 1% of outliers at FP16 precision, while the remaining non-outlier weights are stored in a low-bit format.
On zero-shot tasks, GWQ-quantized models achieve higher accuracy than other quantization methods.
arXiv Detail & Related papers (2024-10-30T11:16:04Z)
- Linear Partial Gromov-Wasserstein Embedding [8.23887869467319]
The Gromov-Wasserstein (GW) problem has attracted growing interest in the machine learning and data science communities.
We propose the linear partial Gromov-Wasserstein (LPGW) embedding, a linearized embedding technique for the PGW problem.
Analogous to the linearization technique for the classical OT problem, we prove that LPGW defines a valid metric for metric measure spaces.
arXiv Detail & Related papers (2024-10-22T03:54:52Z)
- Uncovering Challenges of Solving the Continuous Gromov-Wasserstein Problem [63.99794069984492]
The Gromov-Wasserstein Optimal Transport (GWOT) problem has attracted special attention from the ML community.
We crash-test existing continuous GWOT approaches on different scenarios, carefully record and analyze the obtained results, and identify issues.
We propose a new continuous GWOT method that does not rely on discrete techniques and partially resolves some of the issues of competing approaches.
arXiv Detail & Related papers (2023-03-10T15:21:12Z)
- Outlier-Robust Gromov-Wasserstein for Graph Data [31.895380224961464]
We introduce a new and robust version of the Gromov-Wasserstein (GW) distance called RGW.
RGW features optimistically perturbed marginal constraints within a Kullback-Leibler divergence-based ambiguity set.
We demonstrate the effectiveness of RGW on real-world graph learning tasks, such as subgraph matching and partial shape correspondence.
arXiv Detail & Related papers (2023-02-09T12:57:29Z)
- Markovian Sliced Wasserstein Distances: Beyond Independent Projections [51.80527230603978]
We introduce a new family of sliced Wasserstein (SW) distances, named the Markovian sliced Wasserstein (MSW) distance, which imposes a first-order Markov structure on the projection directions.
We compare MSW with previous SW variants in various applications such as flows, color transfer, and deep generative modeling to demonstrate its favorable performance (a schematic sketch of Markov-chained projection directions appears after this list).
arXiv Detail & Related papers (2023-01-10T01:58:15Z)
- Learning to Solve PDE-constrained Inverse Problems with Graph Networks [51.89325993156204]
In many application domains across science and engineering, we are interested in solving inverse problems with constraints defined by a partial differential equation (PDE).
Here we explore graph neural networks (GNNs) to solve such PDE-constrained inverse problems.
We demonstrate computational speedups of up to 90x using GNNs compared to principled solvers.
arXiv Detail & Related papers (2022-06-01T18:48:01Z)
- Efficient Approximation of Gromov-Wasserstein Distance using Importance Sparsification [34.865115235547286]
We propose a novel importance sparsification method, called Spar-GW, to approximate the Gromov-Wasserstein distance efficiently.
We demonstrate that the proposed Spar-GW method is applicable to the GW distance with an arbitrary ground cost.
In addition, this method can be extended to approximate the variants of GW distance, including the entropic GW distance, the fused GW distance, and the unbalanced GW distance.
arXiv Detail & Related papers (2022-05-26T18:30:40Z)
- Quantized Gromov-Wasserstein [10.592277756185046]
Quantized Gromov-Wasserstein (qGW) is a metric that treats parts as fundamental objects and fits into a hierarchy of theoretical upper bounds for the problem.
We develop an algorithm for approximating optimal GW matchings which yields algorithmic speedups and reductions in memory complexity.
We not only outperform state-of-the-art methods but also apply GW matching at scales an order of magnitude larger than in the existing literature.
arXiv Detail & Related papers (2021-04-05T17:03:20Z)
- On Projection Robust Optimal Transport: Sample Complexity and Model Misspecification [101.0377583883137]
Projection robust (PR) OT seeks to maximize the OT cost between two measures by choosing a $k$-dimensional subspace onto which they can be projected.
Our first contribution is to establish several fundamental statistical properties of PR Wasserstein distances.
Next, we propose the integral PR Wasserstein (IPRW) distance as an alternative to the PRW distance, obtained by averaging over subspaces rather than optimizing over them (a subspace-averaging sketch appears after this list).
arXiv Detail & Related papers (2020-06-22T14:35:33Z)
- Fast and Robust Comparison of Probability Measures in Heterogeneous Spaces [62.35667646858558]
We introduce the Anchor Energy (AE) and Anchor Wasserstein (AW) distances, which are respectively the energy and Wasserstein distances instantiated on such representations.
Our main contribution is to propose a sweep-line algorithm to compute AE exactly in log-quadratic time, where a naive implementation would be cubic.
We show that AE and AW perform well in various experimental settings at a fraction of the computational cost of popular GW approximations.
arXiv Detail & Related papers (2020-02-05T03:09:23Z)
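The Markovian sliced Wasserstein entry above replaces independent projection directions with a first-order Markov chain of directions. Below is a hedged, self-contained sketch of that idea under simple assumptions (a random-walk transition on the unit sphere, equal sample sizes, uniform weights); the MSW paper's transition kernel, burn-in, and estimator details may differ.

```python
# Hedged sketch: sliced Wasserstein with projection directions forming a first-order
# Markov chain (random walk on the unit sphere) instead of independent draws.
import numpy as np

def wasserstein_1d(u, v, p=2):
    """p-Wasserstein distance between two 1-D empirical measures with equal sample sizes."""
    return (np.mean(np.abs(np.sort(u) - np.sort(v)) ** p)) ** (1.0 / p)

def markov_sliced_wasserstein(X, Y, n_proj=100, step=0.3, p=2, seed=0):
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    theta = rng.normal(size=d)
    theta /= np.linalg.norm(theta)          # initial direction, uniform on the sphere
    dists = []
    for _ in range(n_proj):
        dists.append(wasserstein_1d(X @ theta, Y @ theta, p) ** p)
        # Markov transition: perturb the previous direction and renormalize
        theta = theta + step * rng.normal(size=d)
        theta /= np.linalg.norm(theta)
    return np.mean(dists) ** (1.0 / p)

X = np.random.default_rng(3).normal(size=(200, 5))
Y = np.random.default_rng(4).normal(size=(200, 5)) + 1.0
print("Markov-chained sliced Wasserstein estimate:", markov_sliced_wasserstein(X, Y))
```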
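The projection-robust OT entry above optimizes the OT cost over k-dimensional subspaces, with IPRW averaging over subspaces instead. The sketch below illustrates the averaging variant using random orthonormal projections and POT's exact OT solver (ot.emd2); the paper's precise estimator, sampling measure, and normalization are assumptions here.

```python
# Hedged sketch of subspace-averaged ("integral") projection-robust Wasserstein:
# project both point clouds onto random k-dimensional subspaces and average the OT costs.
import numpy as np
import ot  # Python Optimal Transport (POT)

def iprw2(X, Y, k=2, n_proj=64, seed=0):
    """Average squared-Euclidean OT cost over random k-dimensional orthonormal projections."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    a = np.full(len(X), 1.0 / len(X))       # uniform weights on X
    b = np.full(len(Y), 1.0 / len(Y))       # uniform weights on Y
    costs = []
    for _ in range(n_proj):
        # random orthonormal basis of a k-dimensional subspace (QR of a Gaussian matrix)
        Q, _ = np.linalg.qr(rng.normal(size=(d, k)))
        M = ot.dist(X @ Q, Y @ Q)           # pairwise squared distances after projection
        costs.append(ot.emd2(a, b, M))      # exact OT cost on the projected clouds
    return float(np.mean(costs))

X = np.random.default_rng(1).normal(size=(60, 10))
Y = np.random.default_rng(2).normal(size=(80, 10)) + 0.5
print("subspace-averaged OT cost:", iprw2(X, Y, k=3))
```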