Gromov-Wasserstein Graph Coarsening
- URL: http://arxiv.org/abs/2511.08733v1
- Date: Thu, 13 Nov 2025 01:04:44 GMT
- Title: Gromov-Wasserstein Graph Coarsening
- Authors: Carlos A. Taveras, Santiago Segarra, César A. Uribe
- Abstract summary: We study the problem of graph coarsening within the Gromov-Wasserstein geometry. We propose two algorithms that leverage a novel representation of the distortion induced by merging pairs of nodes.
- Score: 36.62418750027048
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We study the problem of graph coarsening within the Gromov-Wasserstein geometry. Specifically, we propose two algorithms that leverage a novel representation of the distortion induced by merging pairs of nodes. The first method, termed Greedy Pair Coarsening (GPC), iteratively merges pairs of nodes that locally minimize a measure of distortion until the desired size is achieved. The second method, termed $k$-means Greedy Pair Coarsening (KGPC), leverages clustering based on pairwise distortion metrics to directly merge clusters of nodes. We provide conditions guaranteeing optimal coarsening for our methods and validate their performance on six large-scale datasets and a downstream clustering task. Results show that the proposed methods outperform existing approaches on a wide range of parameters and scenarios.
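To make the greedy idea concrete, here is a rough sketch of pair coarsening on a pairwise-distance matrix: repeatedly merge the pair of nodes whose merge least perturbs the distance structure, until a target size is reached. The squared row-difference cost and the row-averaging merge rule are illustrative stand-ins, not the paper's Gromov-Wasserstein distortion representation.

```python
import numpy as np

def greedy_pair_coarsening(D, target_size):
    # Iteratively merge the pair of nodes whose merge least perturbs
    # the pairwise-distance structure, until `target_size` nodes remain.
    # The squared row-difference cost below is an illustrative proxy,
    # not the paper's Gromov-Wasserstein distortion measure.
    D = D.astype(float).copy()
    groups = [[i] for i in range(len(D))]
    while len(groups) > target_size:
        n = len(D)
        best_cost, best_pair = np.inf, None
        for i in range(n):
            for j in range(i + 1, n):
                cost = np.sum((D[i] - D[j]) ** 2)  # local distortion proxy
                if cost < best_cost:
                    best_cost, best_pair = cost, (i, j)
        i, j = best_pair
        merged = (D[i] + D[j]) / 2.0  # represent the pair by averaged distances
        D[i, :] = merged
        D[:, i] = merged
        D[i, i] = 0.0
        D = np.delete(np.delete(D, j, axis=0), j, axis=1)
        groups[i].extend(groups[j])
        groups.pop(j)
    return D, groups
```

On four points placed at 0, 1, 10, 11 on a line, this merges the two close pairs first, returning the partition {0,1}, {2,3}.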
Related papers
- An Enhanced Model-based Approach for Short Text Clustering [58.60681789677676]
Short text clustering has become increasingly important with the popularity of social media like Twitter, Google+, and Facebook. Existing methods can be broadly categorized into two paradigms: topic model-based approaches and deep representation learning-based approaches. We propose a collapsed Gibbs Sampling algorithm for the Dirichlet Multinomial Mixture model (GSDMM), which effectively handles the sparsity and high dimensionality of short texts. Based on several aspects of GSDMM that warrant further refinement, we propose an improved approach, GSDMM+, designed to further optimize its performance.
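A single collapsed-Gibbs sweep for a Dirichlet Multinomial Mixture can be sketched roughly as follows. This is a simplified illustration of the GSDMM idea, not the authors' implementation: the per-word conditional below omits the within-document repeated-word corrections of the exact sampler.

```python
import numpy as np

def gsdmm_sweep(docs, z, K, alpha, beta, V, rng):
    # One collapsed-Gibbs sweep for a Dirichlet Multinomial Mixture.
    # docs: list of word-id lists; z: current cluster label per document;
    # K clusters, vocabulary size V; alpha, beta: Dirichlet priors.
    m = np.zeros(K)       # number of documents per cluster
    n = np.zeros((K, V))  # per-cluster word counts
    nk = np.zeros(K)      # total words per cluster
    for d, words in enumerate(docs):
        m[z[d]] += 1
        for w in words:
            n[z[d], w] += 1
        nk[z[d]] += len(words)
    for d, words in enumerate(docs):
        k = z[d]
        # remove document d from the counts
        m[k] -= 1
        nk[k] -= len(words)
        for w in words:
            n[k, w] -= 1
        # unnormalized log-conditional p(z_d = k | rest), simplified:
        # ignores repeated-word increments within the document
        logp = np.log(m + alpha)
        for w in words:
            logp += np.log(n[:, w] + beta) - np.log(nk + V * beta)
        p = np.exp(logp - logp.max())
        p /= p.sum()
        z[d] = rng.choice(K, p=p)
        # add document d back under its (possibly new) cluster
        m[z[d]] += 1
        nk[z[d]] += len(words)
        for w in words:
            n[z[d], w] += 1
    return z
```

In practice several sweeps are run until the cluster assignments stabilize; empty clusters simply stop attracting documents.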
arXiv Detail & Related papers (2025-07-18T10:07:42Z) - AH-UGC: Adaptive and Heterogeneous-Universal Graph Coarsening [4.831609704970507]
We introduce a novel framework that combines Locality Sensitive Hashing (LSH) with Consistent Hashing to enable adaptive graph coarsening. Our approach is the first unified framework to support both adaptive and heterogeneous coarsening.
arXiv Detail & Related papers (2025-05-18T09:57:33Z) - A Greedy Strategy for Graph Cut [95.2841574410968]
We propose a greedy strategy to solve the problem of Graph Cut, called GGC. It starts from the state where each data sample is regarded as its own cluster and dynamically merges pairs of clusters. GGC has nearly linear computational complexity with respect to the number of samples.
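The agglomerative scheme described above might look roughly like the following sketch, where the pair of clusters with the largest total inter-cluster affinity is merged at each step. The merge criterion is a simplified stand-in for the paper's Graph Cut objective, and no attempt is made to reproduce GGC's near-linear complexity.

```python
import numpy as np

def greedy_merge_clustering(W, k):
    # Agglomerative sketch: every sample starts as its own cluster,
    # and the two clusters with the largest total inter-cluster
    # affinity are merged until only k clusters remain.
    # W: symmetric nonnegative affinity matrix (zero diagonal).
    clusters = [[i] for i in range(len(W))]
    while len(clusters) > k:
        best_gain, best_pair = -1.0, None
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                # total affinity between clusters a and b
                gain = W[np.ix_(clusters[a], clusters[b])].sum()
                if gain > best_gain:
                    best_gain, best_pair = gain, (a, b)
        a, b = best_pair
        clusters[a].extend(clusters[b])
        clusters.pop(b)
    return clusters
```

On a two-block affinity matrix with strong intra-block and weak inter-block weights, the sketch recovers the two blocks.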
arXiv Detail & Related papers (2024-12-28T05:49:42Z) - An efficient search-and-score algorithm for ancestral graphs using multivariate information scores [0.0]
We propose a greedy search-and-score algorithm for ancestral graphs, which include directed as well as bidirected edges. For computational efficiency, the proposed two-step algorithm relies on local information scores limited to the close surrounding nodes. This computational strategy, although restricted to information contributions from ac-connected subsets containing up to two-collider paths, is shown to outperform state-of-the-art causal discovery methods on challenging benchmark datasets.
arXiv Detail & Related papers (2024-12-23T12:09:14Z) - HeNCler: Node Clustering in Heterophilous Graphs via Learned Asymmetric Similarity [48.62389920549271]
HeNCler learns a similarity graph by optimizing a clustering-specific objective based on weighted kernel singular value decomposition. Our approach enables spectral clustering on an asymmetric similarity graph, providing flexibility for both directed and undirected graphs.
arXiv Detail & Related papers (2024-05-27T11:04:05Z) - Ensemble Quadratic Assignment Network for Graph Matching [52.20001802006391]
Graph matching is a commonly used technique in computer vision and pattern recognition.
Recent data-driven approaches have improved the graph matching accuracy remarkably.
We propose a graph neural network (GNN) based approach to combine the advantages of data-driven and traditional methods.
arXiv Detail & Related papers (2024-03-11T06:34:05Z) - MeanCut: A Greedy-Optimized Graph Clustering via Path-based Similarity and Degree Descent Criterion [0.6906005491572401]
Spectral clustering is popular and attractive due to its remarkable performance, easy implementation, and strong adaptability.
We propose MeanCut as the objective function and greedily optimize it in degree descending order for a nondestructive graph partition.
The validity of our algorithm is demonstrated by testing on real-world benchmarks and an application to face recognition.
arXiv Detail & Related papers (2023-12-07T06:19:39Z) - Dink-Net: Neural Clustering on Large Graphs [59.10189693120368]
A deep graph clustering method (Dink-Net) is proposed with the idea of dilation and shrink.
By discriminating nodes, whether being corrupted by augmentations, representations are learned in a self-supervised manner.
The clustering distribution is optimized by minimizing the proposed cluster dilation loss and cluster shrink loss.
Compared to the runner-up, Dink-Net achieves a 9.62% NMI improvement on the ogbn-papers100M dataset with 111 million nodes and 1.6 billion edges.
arXiv Detail & Related papers (2023-05-28T15:33:24Z) - Efficient Graph Field Integrators Meet Point Clouds [59.27295475120132]
We present two new classes of algorithms for efficient field integration on graphs encoding point clouds.
The first class, SeparatorFactorization(SF), leverages the bounded genus of point cloud mesh graphs, while the second class, RFDiffusion(RFD), uses popular epsilon-nearest-neighbor graph representations for point clouds.
arXiv Detail & Related papers (2023-02-02T08:33:36Z) - Robust Hypergraph Clustering via Convex Relaxation of Truncated MLE [12.805268849262246]
We study hypergraph clustering in the weighted $d$-uniform hypergraph stochastic block model ($d$-WHSBM).
We propose a new hypergraph clustering algorithm, called CRTMLE, and provide its performance guarantee under the $d$-WHSBM for general parameter regimes.
arXiv Detail & Related papers (2020-03-23T01:02:28Z) - Efficient Clustering for Stretched Mixtures: Landscape and Optimality [4.2111286819721485]
This paper considers a canonical clustering problem where one receives unlabeled samples drawn from a balanced mixture of two elliptical distributions.
We show that a non-convex clustering objective exhibits desirable geometric properties when the sample size exceeds a constant multiple of the dimension.
arXiv Detail & Related papers (2020-03-22T17:57:07Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.