Graph Edit Distance Learning via Different Attention
- URL: http://arxiv.org/abs/2308.13871v1
- Date: Sat, 26 Aug 2023 13:05:01 GMT
- Title: Graph Edit Distance Learning via Different Attention
- Authors: Jiaxi Lv, Liang Zhang, Yi Huang, Jiancheng Huang, Shifeng Chen
- Abstract summary: This paper proposes a novel graph-level fusion module Different Attention (DiffAtt)
DiffAtt uses the difference between two graph-level embeddings as an attentional mechanism to capture the graph structural difference of the two graphs.
Based on DiffAtt, a new GSC method, named Graph Edit Distance Learning via Different Attention (REDRAFT), is proposed.
- Score: 11.79198639644178
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Recently, more and more research has focused on using Graph Neural Networks
(GNN) to solve the Graph Similarity Computation problem (GSC), i.e., computing
the Graph Edit Distance (GED) between two graphs. These methods treat GSC as an
end-to-end learnable task, and the core of their architecture is the feature
fusion modules that interact with the features of the two graphs. Existing methods
hold that graph-level embeddings struggle to capture the differences in small
local structures between two graphs, and that performing fine-grained feature
fusion on node-level embeddings can therefore improve accuracy, but at the cost of
greater time and memory consumption in the training and inference phases.
However, this paper proposes a novel graph-level fusion module Different
Attention (DiffAtt), and demonstrates that graph-level fusion embeddings can
substantially outperform these complex node-level fusion embeddings. We posit
that the relative difference structure of the two graphs plays an important
role in calculating their GED values. To this end, DiffAtt uses the difference
between two graph-level embeddings as an attentional mechanism to capture the
graph structural difference of the two graphs. Based on DiffAtt, a new GSC
method, named Graph Edit Distance Learning via Different Attention (REDRAFT),
is proposed, and experimental results demonstrate that REDRAFT achieves
state-of-the-art performance on 23 out of 25 metrics across five benchmark
datasets. On MSE in particular, it outperforms the second-best method by
19.9%, 48.8%, 29.1%, 31.6%, and 2.2%, respectively. Moreover, we propose a quantitative test,
Remaining Subgraph Alignment Test (RESAT) to verify that among all graph-level
fusion modules, the fusion embedding generated by DiffAtt can best capture the
structural differences between two graphs.
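The core DiffAtt idea described above, using the difference between two graph-level embeddings as an attention signal for fusion, can be sketched roughly as follows. This is an illustrative NumPy sketch, not the authors' architecture: the projection matrix `W`, the sigmoid gating, and the convex-combination fusion are assumptions made for exposition only.

```python
import numpy as np

def diff_att_fuse(h1, h2, W):
    """Illustrative difference-based attention fusion (hypothetical, not the
    paper's exact module). h1, h2: (dim,) graph-level embeddings of the two
    graphs; W: (dim, dim) learned projection (here just a random stand-in)."""
    diff = h1 - h2                          # relative structural difference
    a = 1.0 / (1.0 + np.exp(-(diff @ W)))   # sigmoid attention from the difference
    return a * h1 + (1.0 - a) * h2          # difference-gated fusion embedding

rng = np.random.default_rng(0)
h1 = rng.standard_normal(8)                 # embedding of graph 1
h2 = rng.standard_normal(8)                 # embedding of graph 2
W = rng.standard_normal((8, 8))             # stand-in for a learned projection
fused = diff_att_fuse(h1, h2, W)
```

Note that when the two graphs have identical embeddings the difference is zero, the gate is 0.5 everywhere, and the fused embedding collapses to the shared embedding, which matches the intuition that identical graphs have GED 0 and nothing to attend to.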
Related papers
- InstructG2I: Synthesizing Images from Multimodal Attributed Graphs [50.852150521561676]
We propose a graph context-conditioned diffusion model called InstructG2I.
InstructG2I first exploits the graph structure and multimodal information to conduct informative neighbor sampling.
A Graph-QFormer encoder adaptively encodes the graph nodes into an auxiliary set of graph prompts to guide the denoising process.
arXiv Detail & Related papers (2024-10-09T17:56:15Z) - A Simple and Scalable Graph Neural Network for Large Directed Graphs [11.792826520370774]
We investigate various combinations of node representations and edge direction awareness within an input graph.
In response, we propose a simple yet holistic classification method A2DUG.
We demonstrate that A2DUG performs stably well on various datasets and improves accuracy by up to 11.29 compared with state-of-the-art methods.
arXiv Detail & Related papers (2023-06-14T06:24:58Z) - Subgraph Networks Based Contrastive Learning [5.736011243152416]
Graph contrastive learning (GCL) can solve the problem of annotated data scarcity.
Most existing GCL methods focus on the design of graph augmentation strategies and mutual information estimation operations.
We propose a novel framework called subgraph network-based contrastive learning (SGNCL)
arXiv Detail & Related papers (2023-06-06T08:52:44Z) - CGMN: A Contrastive Graph Matching Network for Self-Supervised Graph Similarity Learning [65.1042892570989]
We propose a contrastive graph matching network (CGMN) for self-supervised graph similarity learning.
We employ two strategies, namely cross-view interaction and cross-graph interaction, for effective node representation learning.
We transform node representations into graph-level representations via pooling operations for graph similarity computation.
arXiv Detail & Related papers (2022-05-30T13:20:26Z) - GraphCoCo: Graph Complementary Contrastive Learning [65.89743197355722]
Graph Contrastive Learning (GCL) has shown promising performance in graph representation learning (GRL) without the supervision of manual annotations.
This paper proposes an effective graph complementary contrastive learning approach named GraphCoCo to tackle the above issue.
arXiv Detail & Related papers (2022-03-24T02:58:36Z) - Joint Graph Learning and Matching for Semantic Feature Correspondence [69.71998282148762]
We propose a joint graph learning and matching network, named GLAM, to explore reliable graph structures for boosting graph matching.
The proposed method is evaluated on three popular visual matching benchmarks (Pascal VOC, Willow Object and SPair-71k)
It outperforms previous state-of-the-art graph matching methods by significant margins on all benchmarks.
arXiv Detail & Related papers (2021-09-01T08:24:02Z) - Graph Contrastive Learning with Augmentations [109.23158429991298]
We propose a graph contrastive learning (GraphCL) framework for learning unsupervised representations of graph data.
We show that our framework can produce graph representations of similar or better generalizability, transferability, and robustness compared to state-of-the-art methods.
arXiv Detail & Related papers (2020-10-22T20:13:43Z) - Multilevel Graph Matching Networks for Deep Graph Similarity Learning [79.3213351477689]
We propose a multi-level graph matching network (MGMN) framework for computing the graph similarity between any pair of graph-structured objects.
To compensate for the lack of standard benchmark datasets, we have created and collected a set of datasets for both the graph-graph classification and graph-graph regression tasks.
Comprehensive experiments demonstrate that MGMN consistently outperforms state-of-the-art baseline models on both the graph-graph classification and graph-graph regression tasks.
arXiv Detail & Related papers (2020-07-08T19:48:19Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.