GEDAN: Learning the Edit Costs for Graph Edit Distance
- URL: http://arxiv.org/abs/2508.03111v1
- Date: Tue, 05 Aug 2025 05:44:28 GMT
- Title: GEDAN: Learning the Edit Costs for Graph Edit Distance
- Authors: Francesco Leonardi, Markus Orsi, Jean-Louis Reymond, Kaspar Riesen
- Abstract summary: We present a novel Graph Neural Network framework that approximates Graph Edit Distance (GED) using both supervised and unsupervised training. A core component of our architecture is the integration of a Generalized Additive Model, which allows the flexible and interpretable learning of context-aware edit costs.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Graph Edit Distance (GED) is defined as the minimum cost transformation of one graph into another and is a widely adopted metric for measuring the dissimilarity between graphs. The major problem of GED is that its computation is NP-hard, which has in turn led to the development of various approximation methods, including approaches based on neural networks (NN). Most of these NN-based models simplify the problem of GED by assuming unit-cost edit operations, a rather unrealistic constraint in real-world applications. In this work, we present a novel Graph Neural Network framework that approximates GED using both supervised and unsupervised training. In the unsupervised setting, it employs a gradient-only self-organizing mechanism that enables optimization without ground-truth distances. Moreover, a core component of our architecture is the integration of a Generalized Additive Model, which allows the flexible and interpretable learning of context-aware edit costs. Experimental results show that the proposed method achieves similar results as state-of-the-art reference methods, yet significantly improves both adaptability and interpretability. That is, the learned cost function offers insights into complex graph structures, making it particularly valuable in domains such as molecular analysis and structural pattern discovery.
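As the abstract notes, most neural approximators assume unit-cost edit operations, whereas GED is defined for arbitrary cost functions. The sketch below is a minimal, exponential-time illustration of that definition, not the paper's method: for two equal-size labelled graphs it enumerates every node bijection and scores each mapping with a user-supplied node-substitution cost plus a fixed cost per mismatched edge.

```python
from itertools import permutations

def ged_bruteforce(labels1, edges1, labels2, edges2, node_sub_cost, edge_cost=1.0):
    """Exact GED between two equal-size labelled graphs under a
    user-defined node substitution cost. Enumerates every node
    bijection, so it is only feasible for tiny graphs."""
    n = len(labels1)
    assert len(labels2) == n, "sketch assumes equal-size graphs"
    edge_set2 = {frozenset(e) for e in edges2}
    best = float("inf")
    for perm in permutations(range(n)):
        # cost of relabelling each node under this mapping
        cost = sum(node_sub_cost(labels1[i], labels2[perm[i]]) for i in range(n))
        # edges present in exactly one graph after mapping are edit operations
        mapped = {frozenset((perm[u], perm[v])) for u, v in edges1}
        cost += edge_cost * len(mapped ^ edge_set2)
        best = min(best, cost)
    return best

# A path C-C-O vs. C-C-N: with unit substitution costs the distance is 1,
# but a domain-aware cost (e.g. a cheaper O->N swap for molecules) changes it.
unit = lambda a, b: 0.0 if a == b else 1.0
d = ged_bruteforce(["C", "C", "O"], [(0, 1), (1, 2)],
                   ["C", "C", "N"], [(0, 1), (1, 2)], unit)  # d == 1.0
```

Swapping in a non-unit `node_sub_cost` (say, 0.25 for an O/N substitution) immediately changes the distance, which is exactly the flexibility that unit-cost models give up.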
Related papers
- GRAIL: Graph Edit Distance and Node Alignment Using LLM-Generated Code [11.73546901244934]
Graph Edit Distance (GED) is a widely used metric for measuring similarity between two graphs. Neural methods have achieved improved approximation quality compared to non-neural approaches. GRAIL employs a novel combination of large language models (LLMs) and automated prompt tuning to generate a program that is used to compute GED.
arXiv Detail & Related papers (2025-05-04T14:14:24Z) - Flexible Graph Similarity Computation With A Proactive Optimization Strategy [22.212014309562427]
Graph Edit Distance (GED) offers a principled and flexible measure of graph similarity. GED quantifies the minimum cost needed to transform one graph into another with customizable edit operation costs. Existing methods struggle with adapting to varying operation costs. We propose GEN, a novel learning-based approach for flexible GED approximation.
arXiv Detail & Related papers (2025-04-09T02:16:46Z) - Community-Centric Graph Unlearning [10.906555492206959]
We propose a novel Graph Structure Mapping Unlearning paradigm (GSMU) and a method based on it named Community-centric Graph Eraser (CGE). CGE maps community subgraphs to nodes, thereby enabling the reconstruction of a node-level unlearning operation within a reduced mapped graph.
arXiv Detail & Related papers (2024-08-19T05:37:35Z) - EUGENE: Explainable Unsupervised Approximation of Graph Edit Distance with Generalized Edit Costs [20.3519249026886]
Graph Edit Distance (GED) is preferred for measuring inter-graph distance. GED computation is hindered by NP-hardness. Unsupervised methods often face challenges in providing accurate approximations.
arXiv Detail & Related papers (2024-02-08T18:23:05Z) - GraphGLOW: Universal and Generalizable Structure Learning for Graph Neural Networks [72.01829954658889]
This paper introduces the mathematical definition of this novel problem setting.
We devise a general framework that coordinates a single graph-shared structure learner and multiple graph-specific GNNs.
The well-trained structure learner can directly produce adaptive structures for unseen target graphs without any fine-tuning.
arXiv Detail & Related papers (2023-06-20T03:33:22Z) - EPIC: Graph Augmentation with Edit Path Interpolation via Learnable Cost [12.191001329584502]
We propose EPIC (Edit Path Interpolation via learnable Cost), a novel method for augmenting graph datasets. To interpolate between two graphs lying in an irregular domain, EPIC builds an edit path that represents the transformation process between the two graphs via edit operations. Our approach outperforms existing augmentation techniques in many tasks.
arXiv Detail & Related papers (2023-06-02T07:19:07Z) - DEGREE: Decomposition Based Explanation For Graph Neural Networks [55.38873296761104]
We propose DEGREE to provide a faithful explanation for GNN predictions.
By decomposing the information generation and aggregation mechanism of GNNs, DEGREE allows tracking the contributions of specific components of the input graph to the final prediction.
We also design a subgraph level interpretation algorithm to reveal complex interactions between graph nodes that are overlooked by previous methods.
arXiv Detail & Related papers (2023-05-22T10:29:52Z) - Comprehensive Graph Gradual Pruning for Sparse Training in Graph Neural Networks [52.566735716983956]
We propose a graph gradual pruning framework termed CGP to dynamically prune GNNs.
Unlike LTH-based methods, the proposed CGP approach requires no re-training, which significantly reduces the computation costs.
Our proposed strategy greatly improves both training and inference efficiency while matching or even exceeding the accuracy of existing methods.
arXiv Detail & Related papers (2022-07-18T14:23:31Z) - Combinatorial Learning of Graph Edit Distance via Dynamic Embedding [108.49014907941891]
This paper presents a hybrid approach that combines the interpretability of traditional search-based techniques for producing the edit path.
Inspired by dynamic programming, node-level embedding is designated in a dynamic reuse fashion and suboptimal branches are encouraged to be pruned.
Experimental results on different graph datasets show that our approach can remarkably ease the search process of A* without sacrificing much accuracy.
arXiv Detail & Related papers (2020-11-30T17:41:02Z) - Tensor Graph Convolutional Networks for Multi-relational and Robust Learning [74.05478502080658]
This paper introduces a tensor-graph convolutional network (TGCN) for scalable semi-supervised learning (SSL) from data associated with a collection of graphs that are represented by a tensor.
The proposed architecture achieves markedly improved performance relative to standard GCNs, copes with state-of-the-art adversarial attacks, and leads to remarkable SSL performance over protein-to-protein interaction networks.
arXiv Detail & Related papers (2020-03-15T02:33:21Z) - Graph Representation Learning via Graphical Mutual Information Maximization [86.32278001019854]
We propose a novel concept, Graphical Mutual Information (GMI), to measure the correlation between input graphs and high-level hidden representations.
We develop an unsupervised learning model trained by maximizing GMI between the input and output of a graph neural encoder.
arXiv Detail & Related papers (2020-02-04T08:33:49Z)
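Returning to the main paper: its core component is a Generalized Additive Model (GAM) over edit costs, meaning the cost decomposes into a sum of univariate shape functions, so each feature's contribution stays directly inspectable. The snippet below is a hand-rolled sketch of that general idea, not the authors' architecture; the piecewise-linear shape functions and the two example features (degree difference, label mismatch) are illustrative assumptions, and in the paper the shape functions would be learned by gradient descent.

```python
class PiecewiseLinear:
    """Univariate shape function f_k given by values at fixed knots;
    in a learned GAM these values would be trainable parameters."""
    def __init__(self, knots, values):
        self.knots, self.values = knots, values

    def __call__(self, x):
        k, v = self.knots, self.values
        if x <= k[0]:
            return v[0]
        if x >= k[-1]:
            return v[-1]
        for i in range(len(k) - 1):
            if k[i] <= x <= k[i + 1]:
                # linear interpolation between the two surrounding knots
                t = (x - k[i]) / (k[i + 1] - k[i])
                return (1 - t) * v[i] + t * v[i + 1]

def additive_edit_cost(features, shape_fns):
    # GAM structure: the total cost is a sum of per-feature terms, so the
    # influence of each feature can be read off (and plotted) on its own.
    return sum(f(x) for f, x in zip(shape_fns, features))

# Hypothetical features for a node substitution: |degree difference| and a
# 0/1 label-mismatch flag.
f_degree = PiecewiseLinear([0, 2, 4], [0.0, 0.5, 1.0])
f_label = PiecewiseLinear([0, 1], [0.0, 1.0])
cost = additive_edit_cost([1, 0], [f_degree, f_label])  # 0.25 + 0.0
```

Because each term is univariate, plotting `f_degree` and `f_label` over their input ranges reveals exactly how the model prices each kind of structural change, which is the interpretability benefit the abstract highlights for domains such as molecular analysis.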