Computing Approximate Graph Edit Distance via Optimal Transport
- URL: http://arxiv.org/abs/2412.18857v1
- Date: Wed, 25 Dec 2024 09:55:14 GMT
- Title: Computing Approximate Graph Edit Distance via Optimal Transport
- Authors: Qihao Cheng, Da Yan, Tianhao Wu, Zhongyi Huang, Qin Zhang
- Abstract summary: Given a graph pair $(G^1, G^2)$, graph edit distance (GED) is defined as the minimum number of edit operations converting $G^1$ to $G^2$.
GEDIOT is based on inverse optimal transport and leverages a learnable Sinkhorn algorithm to generate the coupling matrix.
GEDGW models GED computation as a linear combination of optimal transport and its variant, the Gromov-Wasserstein discrepancy, for node and edge operations, respectively.
- Score: 16.327678462502668
- License:
- Abstract: Given a graph pair $(G^1, G^2)$, graph edit distance (GED) is defined as the minimum number of edit operations converting $G^1$ to $G^2$. GED is a fundamental operation widely used in many applications, but its exact computation is NP-hard, so the approximation of GED has gained a lot of attention. Data-driven learning-based methods have been found to provide superior results compared to classical approximate algorithms, but they directly fit the coupling relationship between a pair of vertices from their vertex features. We argue that while pairwise vertex features can capture the coupling cost (discrepancy) of a pair of vertices, the vertex coupling matrix should be derived from the vertex-pair cost matrix through a more well-established method that is aware of the global context of the graph pair, such as optimal transport. In this paper, we propose an ensemble approach that integrates a supervised learning-based method and an unsupervised method, both based on optimal transport. Our learning method, GEDIOT, is based on inverse optimal transport that leverages a learnable Sinkhorn algorithm to generate the coupling matrix. Our unsupervised method, GEDGW, models GED computation as a linear combination of optimal transport and its variant, Gromov-Wasserstein discrepancy, for node and edge operations, respectively, which can be solved efficiently without needing the ground truth. Our ensemble method, GEDHOT, combines GEDIOT and GEDGW to further boost the performance. Extensive experiments demonstrate that our methods significantly outperform the existing methods in terms of the performance of GED computation, edit path generation, and model generalizability.
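As background for the optimal-transport machinery in the abstract, the following is a minimal NumPy sketch of the classical entropic Sinkhorn iteration: it takes a vertex-pair cost matrix and derives a soft coupling matrix whose rows and columns respect the node marginals, which is exactly the step the authors argue should replace direct fitting of vertex couplings. This is a generic illustration, not the paper's GEDIOT implementation; the learnable cost/regularization components and the handling of graphs with different node counts are omitted.

```python
# Minimal Sinkhorn sketch (NumPy only): turn a vertex-pair cost matrix C for
# graphs G^1 (n nodes) and G^2 (m nodes) into a soft coupling matrix via
# entropic optimal transport with uniform node marginals.
import numpy as np

def sinkhorn_coupling(C, eps=0.1, n_iter=200):
    """Entropic OT coupling for uniform marginals via Sinkhorn iterations."""
    n, m = C.shape
    a = np.full(n, 1.0 / n)          # source marginal (nodes of G^1)
    b = np.full(m, 1.0 / m)          # target marginal (nodes of G^2)
    K = np.exp(-C / eps)             # Gibbs kernel built from the cost matrix
    u, v = np.ones(n), np.ones(m)
    for _ in range(n_iter):
        u = a / (K @ v)              # row scaling
        v = b / (K.T @ u)            # column scaling
    return np.diag(u) @ K @ np.diag(v)   # coupling matrix; rows ~ a, columns ~ b

# Toy example: random costs between a 4-node and a 5-node graph.
rng = np.random.default_rng(0)
P = sinkhorn_coupling(rng.random((4, 5)))
print(P.sum(axis=1))   # ~[0.25, 0.25, 0.25, 0.25]: each source node is fully transported
```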
Related papers
- Near-Optimal Online Learning for Multi-Agent Submodular Coordination: Tight Approximation and Communication Efficiency [52.60557300927007]
We present a $\textbf{MA-OSMA}$ algorithm to transform the discrete submodular problem into a continuous optimization.
We also introduce a projection-free $\textbf{MA-OSEA}$ algorithm, which effectively utilizes the KL divergence by mixing a uniform distribution.
Our algorithms significantly improve the $(\frac{1}{1+c})$-approximation provided by the state-of-the-art OSG algorithm.
arXiv Detail & Related papers (2025-02-07T15:57:56Z) - Graph Similarity Computation via Interpretable Neural Node Alignment [19.34487413883093]
Graph Edit Distance (GED) and Maximum Common Subgraphs (MCS) are commonly used metrics to evaluate graph similarity in practice.
Deep learning models are well-known "black boxes", so the characteristic one-to-one node/subgraph alignment process cannot be inspected.
We propose a novel interpretable neural node alignment model without relying on node alignment ground truth information.
arXiv Detail & Related papers (2024-12-13T10:23:27Z) - Adaptive Principal Components Allocation with the $\ell_{2,g}$-regularized Gaussian Graphical Model for Efficient Fine-Tuning Large Models [7.6656660956453635]
We propose a novel Parameter-Efficient Fine-Tuning (PEFT) approach based on Gaussian Graphical Models (GGMs).
We demonstrate the effectiveness of the proposed approach, achieving competitive performance with significantly fewer trainable parameters.
arXiv Detail & Related papers (2024-12-11T18:11:21Z) - GDSG: Graph Diffusion-based Solution Generator for Optimization Problems in MEC Networks [109.17835015018532]
We present a Graph Diffusion-based Solution Generation (GDSG) method.
This approach is designed to work with suboptimal datasets while converging to the optimal solution with high probability.
We build GDSG as a multi-task diffusion model utilizing a Graph Neural Network (GNN) to acquire the distribution of high-quality solutions.
arXiv Detail & Related papers (2024-12-11T11:13:43Z) - Efficient Graph Similarity Computation with Alignment Regularization [7.143879014059894]
Graph similarity computation (GSC) is a learning-based prediction task using Graph Neural Networks (GNNs).
We show that high-quality learning can be attained with a simple yet powerful regularization technique, which we call Alignment Regularization (AReg).
In the inference stage, the graph-level representations learned by the GNN encoder are directly used to compute the similarity score, without applying AReg again, to speed up inference; see the sketch below.
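To make the inference step concrete, here is a minimal sketch (with assumed names and an assumed scoring function, not the AReg paper's code) of reading a similarity score directly off two graph-level embeddings, with no alignment module in the loop:

```python
# Hypothetical inference-time scoring from precomputed graph-level embeddings.
import numpy as np

def similarity(h1: np.ndarray, h2: np.ndarray) -> float:
    """Map the distance between two graph embeddings to a (0, 1] similarity score."""
    return float(np.exp(-np.linalg.norm(h1 - h2)))

# Close embeddings produce a score near 1; distant embeddings a score near 0.
print(similarity(np.array([0.2, 0.9]), np.array([0.3, 0.8])))
```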
arXiv Detail & Related papers (2024-06-21T07:37:28Z) - Combining Optimal Transport and Embedding-Based Approaches for More Expressiveness in Unsupervised Graph Alignment [19.145556156889064]
Unsupervised graph alignment finds the one-to-one node correspondence between a pair of attributed graphs by only exploiting graph structure and node features.
We propose a principled approach to combine their advantages motivated by theoretical analysis of model expressiveness.
We are the first to guarantee the one-to-one matching constraint by reducing the problem to maximum weight matching.
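For illustration, the reduction to maximum weight matching can be sketched as follows; this is a generic example using SciPy's Hungarian solver on an assumed soft score matrix, not the paper's implementation:

```python
# Extract a hard one-to-one node correspondence from a soft coupling/score matrix
# by maximum weight matching (Hungarian algorithm on the negated scores).
import numpy as np
from scipy.optimize import linear_sum_assignment

S = np.array([[0.7, 0.2, 0.1],
              [0.1, 0.8, 0.1],
              [0.3, 0.3, 0.4]])          # soft node-to-node scores (assumed input)
rows, cols = linear_sum_assignment(-S)   # minimize -S, i.e. maximize total score
print([(int(i), int(j)) for i, j in zip(rows, cols)])   # [(0, 0), (1, 1), (2, 2)]
```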
arXiv Detail & Related papers (2024-06-19T04:57:35Z) - Ensemble Quadratic Assignment Network for Graph Matching [52.20001802006391]
Graph matching is a commonly used technique in computer vision and pattern recognition.
Recent data-driven approaches have improved the graph matching accuracy remarkably.
We propose a graph neural network (GNN) based approach to combine the advantages of data-driven and traditional methods.
arXiv Detail & Related papers (2024-03-11T06:34:05Z) - MATA*: Combining Learnable Node Matching with A* Algorithm for Approximate Graph Edit Distance Computation [12.437507185260577]
The exact Graph Edit Distance (GED) computation is known to be NP-complete.
We present MATA*, a data-driven hybrid approach for approximate GED computation based on Graph Neural Networks (GNNs) and A* algorithms.
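As a reminder of why approximation is needed, exact GED solvers are only practical on very small graphs; a tiny example using NetworkX's built-in exact solver (a generic illustration, unrelated to MATA* itself):

```python
# Exact GED is NP-complete, so exhaustive solvers only scale to tiny graphs.
import networkx as nx

G1 = nx.path_graph(4)    # 0-1-2-3
G2 = nx.cycle_graph(4)   # 0-1-2-3-0
print(nx.graph_edit_distance(G1, G2))   # 1.0: a single edge insertion converts G1 to G2
```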
arXiv Detail & Related papers (2023-11-04T09:33:08Z) - Leveraging Non-uniformity in First-order Non-convex Optimization [93.6817946818977]
Non-uniform refinement of objective functions leads to Non-uniform Smoothness (NS) and the Non-uniform Lojasiewicz inequality (NL).
The new definitions inspire geometry-aware first-order methods that converge to global optimality faster than the classical $\Omega(1/t^2)$ lower bounds.
arXiv Detail & Related papers (2021-05-13T04:23:07Z) - Deep Reinforcement Learning of Graph Matching [63.469961545293756]
Graph matching (GM) under node and pairwise constraints has been a building block in areas from optimization to computer vision.
We present a reinforcement learning solver for GM, i.e., RGM, that seeks the node correspondence between pairwise graphs.
Our method differs from previous deep graph matching models in that they focus on front-end feature extraction and affinity function learning.
arXiv Detail & Related papers (2020-12-16T13:48:48Z) - Stochastic Flows and Geometric Optimization on the Orthogonal Group [52.50121190744979]
We present a new class of geometrically-driven optimization algorithms on the orthogonal group $O(d)$.
We show that our methods can be applied in various fields of machine learning including deep, convolutional and recurrent neural networks, reinforcement learning, flows and metric learning.
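For context, a standard way to take a gradient step while staying exactly on the orthogonal group $O(d)$ is the Cayley retraction; the following is a generic sketch of that technique, not the stochastic-flow algorithms proposed in the paper:

```python
# One gradient step constrained to the orthogonal group O(d) via a Cayley retraction.
import numpy as np

def cayley_step(Q, G, lr=0.1):
    """Q: current orthogonal matrix; G: Euclidean gradient of the loss at Q."""
    d = Q.shape[0]
    A = G @ Q.T - Q @ G.T                       # skew-symmetric update direction
    I = np.eye(d)
    # (I + (lr/2) A)^{-1} (I - (lr/2) A) is orthogonal for skew-symmetric A.
    return np.linalg.solve(I + (lr / 2) * A, I - (lr / 2) * A) @ Q

Q = np.eye(3)
G = np.random.default_rng(1).standard_normal((3, 3))
Q_next = cayley_step(Q, G)
print(np.allclose(Q_next @ Q_next.T, np.eye(3)))   # True: the iterate stays on O(3)
```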
arXiv Detail & Related papers (2020-03-30T15:37:50Z)
This list is automatically generated from the titles and abstracts of the papers on this site.