GRAIL: Graph Edit Distance and Node Alignment Using LLM-Generated Code
- URL: http://arxiv.org/abs/2505.02124v1
- Date: Sun, 04 May 2025 14:14:24 GMT
- Title: GRAIL: Graph Edit Distance and Node Alignment Using LLM-Generated Code
- Authors: Samidha Verma, Arushi Goyal, Ananya Mathur, Ankit Anand, Sayan Ranu
- Abstract summary: Graph Edit Distance (GED) is a widely used metric for measuring similarity between two graphs. Neural methods have achieved improved approximation quality compared to non-neural approaches. GRAIL employs a novel combination of large language models (LLMs) and automated prompt tuning to generate a program that is used to compute GED.
- Score: 11.73546901244934
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Graph Edit Distance (GED) is a widely used metric for measuring similarity between two graphs. Computing the optimal GED is NP-hard, leading to the development of various neural and non-neural heuristics. While neural methods have achieved improved approximation quality compared to non-neural approaches, they face significant challenges: (1) They require large amounts of ground truth data, which is itself NP-hard to compute. (2) They operate as black boxes, offering limited interpretability. (3) They lack cross-domain generalization, necessitating expensive retraining for each new dataset. We address these limitations with GRAIL, introducing a paradigm shift in this domain. Instead of training a neural model to predict GED, GRAIL employs a novel combination of large language models (LLMs) and automated prompt tuning to generate a program that is used to compute GED. This shift from predicting GED to generating programs imparts various advantages, including end-to-end interpretability and an autonomous self-evolutionary learning mechanism without ground-truth supervision. Extensive experiments on seven datasets confirm that GRAIL not only surpasses state-of-the-art GED approximation methods in prediction quality but also achieves robust cross-domain generalization across diverse graph distributions.
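To make the metric concrete: GED counts the minimum number of node/edge insertions, deletions, and substitutions needed to turn one graph into another, and it can be computed exactly only for small instances. A minimal sketch using networkx's built-in solver (illustrative only, not part of GRAIL):

```python
import networkx as nx

# Exact GED between a triangle and a 3-node path:
# deleting one edge turns the triangle into the path, so GED = 1.
G1 = nx.cycle_graph(3)   # triangle: edges (0,1), (1,2), (2,0)
G2 = nx.path_graph(3)    # path: edges (0,1), (1,2)

ged = nx.graph_edit_distance(G1, G2)
print(ged)  # -> 1.0
```

The exact computation scales exponentially with graph size, which is why the heuristics surveyed below (neural and otherwise) exist.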
Related papers
- GEDAN: Learning the Edit Costs for Graph Edit Distance [0.4866486451835401]
We present a novel Graph Neural Network framework that approximates Graph Edit Distance (GED) using both supervised and unsupervised training. A core component of our architecture is the integration of a Generalized Additive Model, which allows the flexible and interpretable learning of context-aware edit costs.
arXiv Detail & Related papers (2025-08-05T05:44:28Z)
- Towards Unsupervised Training of Matching-based Graph Edit Distance Solver via Preference-aware GAN [29.471035994169323]
Graph Edit Distance (GED) is a fundamental graph similarity metric widely used in various applications. Recent state-of-the-art hybrid GED solvers have shown promising performance. We propose GEDRanker, a novel unsupervised GAN-based framework for GED computation.
arXiv Detail & Related papers (2025-05-16T03:45:31Z)
- GraphBridge: Towards Arbitrary Transfer Learning in GNNs [65.01790632978962]
GraphBridge is a novel framework to enable knowledge transfer across disparate tasks and domains in GNNs. It allows for the augmentation of any pre-trained GNN with prediction heads and a bridging network that connects the input to the output layer. Empirical validation, conducted over 16 datasets representative of these scenarios, confirms the framework's capacity for task- and domain-agnostic transfer learning.
arXiv Detail & Related papers (2025-02-26T15:57:51Z)
- GraphLoRA: Structure-Aware Contrastive Low-Rank Adaptation for Cross-Graph Transfer Learning [17.85404473268992]
Graph Neural Networks (GNNs) have demonstrated remarkable proficiency in handling a range of graph analytical tasks. Despite their versatility, GNNs face significant challenges in transferability, limiting their utility in real-world applications. We propose GraphLoRA, an effective and parameter-efficient method for transferring well-trained GNNs to diverse graph domains.
arXiv Detail & Related papers (2024-09-25T06:57:42Z)
- EUGENE: Explainable Unsupervised Approximation of Graph Edit Distance with Generalized Edit Costs [20.3519249026886]
Graph Edit Distance (GED) is preferred for measuring inter-graph distance. GED computation is hindered by NP-hardness. Unsupervised methods often face challenges in providing accurate approximations.
arXiv Detail & Related papers (2024-02-08T18:23:05Z)
- Learning to Reweight for Graph Neural Network [63.978102332612906]
Graph Neural Networks (GNNs) show promising results for graph tasks.
Existing GNNs' generalization ability will degrade when there exist distribution shifts between testing and training graph data.
We propose a novel nonlinear graph decorrelation method, which can substantially improve the out-of-distribution generalization ability.
arXiv Detail & Related papers (2023-12-19T12:25:10Z)
- MATA*: Combining Learnable Node Matching with A* Algorithm for Approximate Graph Edit Distance Computation [12.437507185260577]
The exact Graph Edit Distance (GED) computation is known to be NP-complete.
We present a data-driven hybrid approach MATA* for approximate GED computation based on Graph Neural Networks (GNNs) and A* algorithms.
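MATA*'s hybrid of learned node matching and A* search is not reproducible in a few lines, but the general idea of anytime approximate GED (progressively tightening an upper bound, stopping when the budget runs out) can be sketched with networkx's generator-based solver. This is an illustrative stand-in, not the MATA* algorithm:

```python
import networkx as nx

# Anytime approximation of GED: optimize_graph_edit_distance yields a
# sequence of non-increasing upper bounds on the true GED, so iteration
# can be cut off early under a time budget.
G1 = nx.cycle_graph(4)  # 4-cycle
G2 = nx.path_graph(4)   # 4-node path: one edge deletion away

best = None
for approx in nx.optimize_graph_edit_distance(G1, G2):
    best = approx  # each yielded bound is at least as tight as the last
print(best)  # -> 1.0 (the exact GED for this pair)
```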
arXiv Detail & Related papers (2023-11-04T09:33:08Z)
- SimTeG: A Frustratingly Simple Approach Improves Textual Graph Learning [131.04781590452308]
We present SimTeG, a frustratingly Simple approach for Textual Graph learning.
We first perform supervised parameter-efficient fine-tuning (PEFT) on a pre-trained LM on the downstream task.
We then generate node embeddings using the last hidden states of finetuned LM.
arXiv Detail & Related papers (2023-08-03T07:00:04Z)
- MentorGNN: Deriving Curriculum for Pre-Training GNNs [61.97574489259085]
We propose an end-to-end model named MentorGNN that aims to supervise the pre-training process of GNNs across graphs.
We shed new light on the problem of domain adaption on relational data (i.e., graphs) by deriving a natural and interpretable upper bound on the generalization error of the pre-trained GNNs.
arXiv Detail & Related papers (2022-08-21T15:12:08Z)
- Neural Graph Matching for Pre-training Graph Neural Networks [72.32801428070749]
Graph neural networks (GNNs) have shown powerful capacity for modeling structural data.
We present a novel Graph Matching based GNN Pre-Training framework, called GMPT.
The proposed method can be applied to fully self-supervised pre-training and coarse-grained supervised pre-training.
arXiv Detail & Related papers (2022-03-03T09:53:53Z)
- GREED: A Neural Framework for Learning Graph Distance Functions [8.323580169763726]
We design a novel siamese graph neural network called GREED, which learns GED and SED in a property-preserving manner.
Through extensive experiments on 10 real graph datasets containing up to 7 million edges, we establish that GREED is not only more accurate than the state of the art, but also up to 3 orders of magnitude faster.
arXiv Detail & Related papers (2021-12-24T21:46:40Z)
- Distance Encoding: Design Provably More Powerful Neural Networks for Graph Representation Learning [63.97983530843762]
Graph Neural Networks (GNNs) have achieved great success in graph representation learning.
GNNs generate identical representations for graph substructures that may in fact be very different.
More powerful GNNs, proposed recently by mimicking higher-order tests, are inefficient as they cannot exploit the sparsity of the underlying graph structure.
We propose Distance Encoding (DE), a new class of structural features for graph representation learning.
arXiv Detail & Related papers (2020-08-31T23:15:40Z)
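The Distance Encoding entry above augments GNN inputs with distance-based structural features. A minimal, hypothetical sketch of one such feature (shortest-path distance to a target node set), written with networkx and not taken from the paper's implementation:

```python
import networkx as nx

def distance_encoding(G, target_nodes):
    """Map each node to its shortest-path distance to the nearest
    target node -- one simple distance-based structural feature
    in the spirit of Distance Encoding (DE)."""
    enc = {v: float("inf") for v in G.nodes}
    for t in target_nodes:
        for v, d in nx.shortest_path_length(G, source=t).items():
            enc[v] = min(enc[v], d)
    return enc

G = nx.path_graph(5)  # path 0-1-2-3-4
print(distance_encoding(G, {0, 4}))
# -> {0: 0, 1: 1, 2: 2, 3: 1, 4: 0}
```

Such features break the symmetry that makes plain message-passing GNNs assign identical representations to structurally distinct substructures.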
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.