Graph-Bert: Only Attention is Needed for Learning Graph Representations
- URL: http://arxiv.org/abs/2001.05140v2
- Date: Wed, 22 Jan 2020 15:16:10 GMT
- Title: Graph-Bert: Only Attention is Needed for Learning Graph Representations
- Authors: Jiawei Zhang, Haopeng Zhang, Congying Xia, Li Sun
- Abstract summary: The dominant graph neural networks (GNNs) over-rely on graph links, causing serious performance problems.
In this paper, we introduce a new graph neural network, namely GRAPH-BERT (Graph-based BERT).
The experimental results demonstrate that GRAPH-BERT can outperform existing GNNs in both learning effectiveness and efficiency.
- Score: 22.031852733026
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The dominant graph neural networks (GNNs) over-rely on the graph links,
which has already led to several serious performance problems, e.g., the
suspended animation problem and the over-smoothing problem. What's more, the
inherently inter-connected nature of graphs precludes parallelization within the
graph, which becomes critical for large-sized graphs, as memory constraints limit
batching across the nodes. In this paper, we introduce a new graph neural
network, namely GRAPH-BERT (Graph-based BERT), based solely on the attention
mechanism without any graph convolution or aggregation operators. Instead of
feeding GRAPH-BERT the complete large input graph, we propose to train
GRAPH-BERT with sampled linkless subgraphs within their local contexts.
GRAPH-BERT can be learned effectively in a standalone mode. Meanwhile, a
pre-trained GRAPH-BERT can also be transferred to other application tasks
directly, or with necessary fine-tuning if supervised label information or a
certain application-oriented objective is available. We have tested the
effectiveness of GRAPH-BERT on several graph benchmark datasets. Based on
GRAPH-BERT pre-trained with the node attribute reconstruction and structure
recovery tasks, we further fine-tune GRAPH-BERT on node classification and
graph clustering tasks specifically. The experimental results demonstrate
that GRAPH-BERT can outperform existing GNNs in both learning effectiveness
and efficiency.
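To make the abstract concrete, below is a minimal, hypothetical PyTorch sketch of the two ideas it describes: sampling a linkless context for each target node, and encoding that context with a plain Transformer encoder (attention only, no graph convolution or aggregation), with the two pre-training signals written as simple reconstruction losses. All names (`LinklessGraphBert`, `sample_linkless_context`) and the random stand-in intimacy scores are illustrative assumptions, not the paper's implementation; the paper's positional embeddings (WL roles, intimacy ranks, hop distances) are omitted for brevity.

```python
import torch
import torch.nn as nn

def sample_linkless_context(scores, k):
    """Pick the k most 'intimate' context nodes per target node.
    scores: (n, n) node-intimacy matrix (the paper derives it from
    PageRank; here it is just any precomputed score matrix). The sampled
    subgraph is linkless: only node features are kept, no edges."""
    return torch.topk(scores, k, dim=1).indices  # (n, k)

class LinklessGraphBert(nn.Module):
    """Encode each target node together with its sampled context using
    a vanilla Transformer encoder -- attention only, no graph operators."""
    def __init__(self, feat_dim, hidden_dim=64, heads=4, layers=2):
        super().__init__()
        self.embed = nn.Linear(feat_dim, hidden_dim)
        enc = nn.TransformerEncoderLayer(d_model=hidden_dim, nhead=heads,
                                         batch_first=True)
        self.encoder = nn.TransformerEncoder(enc, num_layers=layers)
        self.attr_head = nn.Linear(hidden_dim, feat_dim)  # reconstruction head

    def forward(self, ctx_feats):
        # ctx_feats: (batch, k+1, feat_dim); slot 0 holds the target node.
        h = self.encoder(self.embed(ctx_feats))
        return h[:, 0]  # fused representation of the target node

# Toy usage with the two pre-training objectives named in the abstract.
n, d, k = 100, 16, 7
x = torch.randn(n, d)                             # node attributes
scores = torch.rand(n, n)                         # stand-in intimacy scores
idx = sample_linkless_context(scores, k)          # (n, k)
ctx = torch.cat([x.unsqueeze(1), x[idx]], dim=1)  # (n, k+1, d)
model = LinklessGraphBert(d)
z = model(ctx)                                    # (n, hidden)
attr_loss = nn.functional.mse_loss(model.attr_head(z), x)  # attribute reconstruction
s_hat = torch.sigmoid(z @ z.t())                           # structure recovery
struct_loss = nn.functional.mse_loss(s_hat, scores)
```

Because every target node sees only its own fixed-size context, batches of such contexts can be processed independently, which is how a design like this sidesteps the whole-graph batching and memory constraints the abstract criticizes.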
Related papers
- Hypergraph-enhanced Dual Semi-supervised Graph Classification [14.339207883093204]
We propose a Hypergraph-Enhanced DuAL framework named HEAL for semi-supervised graph classification.
To better explore the higher-order relationships among nodes, we design a hypergraph structure learning module that adaptively learns complex node dependencies.
Based on the learned hypergraph, we introduce a line graph to capture the interaction between hyperedges.
arXiv Detail & Related papers (2024-05-08T02:44:13Z)
- G-Retriever: Retrieval-Augmented Generation for Textual Graph Understanding and Question Answering [61.93058781222079]
We develop a flexible question-answering framework targeting real-world textual graphs.
We introduce the first retrieval-augmented generation (RAG) approach for general textual graphs.
G-Retriever performs RAG over a graph by formulating this task as a Prize-Collecting Steiner Tree optimization problem (see the sketch after this list).
arXiv Detail & Related papers (2024-02-12T13:13:04Z)
- Train Your Own GNN Teacher: Graph-Aware Distillation on Textual Graphs [37.48313839125563]
We develop a Graph-Aware Distillation framework (GRAD) to encode graph structures into an LM for graph-free, fast inference.
Different from conventional knowledge distillation, GRAD jointly optimizes a GNN teacher and a graph-free student over the graph's nodes via a shared LM.
Experiments on eight node classification benchmarks in both transductive and inductive settings showcase GRAD's superiority over existing distillation approaches for textual graphs.
arXiv Detail & Related papers (2023-04-20T22:34:20Z)
- An Empirical Study of Retrieval-enhanced Graph Neural Networks [48.99347386689936]
Graph Neural Networks (GNNs) are effective tools for graph representation learning.
We propose a retrieval-enhanced scheme called GRAPHRETRIEVAL, which is agnostic to the choice of graph neural network models.
We conduct comprehensive experiments over 13 datasets and observe that GRAPHRETRIEVAL achieves substantial improvements over existing GNNs.
arXiv Detail & Related papers (2022-06-01T09:59:09Z)
- Neural Graph Matching for Pre-training Graph Neural Networks [72.32801428070749]
Graph neural networks (GNNs) have shown a powerful capacity for modeling structural data.
We present a novel Graph Matching based GNN Pre-Training framework, called GMPT.
The proposed method can be applied to fully self-supervised pre-training and coarse-grained supervised pre-training.
arXiv Detail & Related papers (2022-03-03T09:53:53Z)
- Towards Unsupervised Deep Graph Structure Learning [67.58720734177325]
We propose an unsupervised graph structure learning paradigm, where the learned graph topology is optimized by the data itself without any external guidance.
Specifically, we generate a learning target from the original data as an "anchor graph", and use a contrastive loss to maximize the agreement between the anchor graph and the learned graph.
arXiv Detail & Related papers (2022-01-17T11:57:29Z)
- Dirichlet Graph Variational Autoencoder [65.94744123832338]
We present Dirichlet Graph Variational Autoencoder (DGVAE) with graph cluster memberships as latent factors.
Motivated by the low-pass characteristics of balanced graph cut, we propose a new GNN variant named Heatts to encode the input graph into cluster memberships.
arXiv Detail & Related papers (2020-10-09T07:35:26Z)
- Multilevel Graph Matching Networks for Deep Graph Similarity Learning [79.3213351477689]
We propose a multi-level graph matching network (MGMN) framework for computing the graph similarity between any pair of graph-structured objects.
To compensate for the lack of standard benchmark datasets, we have created and collected a set of datasets for both the graph-graph classification and graph-graph regression tasks.
Comprehensive experiments demonstrate that MGMN consistently outperforms state-of-the-art baseline models on both the graph-graph classification and graph-graph regression tasks.
arXiv Detail & Related papers (2020-07-08T19:48:19Z)
- Graph Neural Distance Metric Learning with Graph-Bert [10.879701971582502]
We introduce a new graph neural network based distance metric learning approach, namely GB-DISTANCE (GRAPH-BERT based Neural Distance).
GB-DISTANCE can learn graph representations effectively based on a pre-trained GRAPH-BERT model.
In addition, GB-DISTANCE maintains the basic properties of a distance metric.
arXiv Detail & Related papers (2020-02-09T18:58:31Z)
- Segmented Graph-Bert for Graph Instance Modeling [10.879701971582502]
We examine the effectiveness of GRAPH-BERT on graph instance representation learning.
In this paper, we re-design it with a segmented architecture, also named SEG-BERT.
We have tested the effectiveness of SEG-BERT with experiments on seven graph instance benchmark datasets.
arXiv Detail & Related papers (2020-02-09T04:55:07Z)
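The G-Retriever entry above reduces retrieval to a Prize-Collecting Steiner Tree (PCST) problem: keep the connected subgraph whose node relevance "prizes" best outweigh the edge "costs" of including them. Below is a minimal sketch of that formulation, assuming the third-party `pcst_fast` solver; the toy graph, prizes, and costs are made-up illustrations, not G-Retriever's actual pipeline.

```python
import numpy as np
from pcst_fast import pcst_fast  # approximate PCST solver (pip install pcst_fast)

# Toy graph: 5 nodes, undirected edges given as (u, v) index pairs.
edges = np.array([[0, 1], [1, 2], [2, 3], [3, 4], [1, 4]])
costs = np.full(len(edges), 0.5)   # cost paid for every edge kept

# Prizes: e.g., similarity of each node's text to the query embedding.
prizes = np.array([0.9, 0.1, 0.0, 0.8, 0.7])

# root=-1 (unrooted), 1 connected component, 'gw' pruning, silent output.
nodes_kept, edges_kept = pcst_fast(edges, prizes, costs, -1, 1, "gw", 0)
print(nodes_kept, edges_kept)  # node/edge indices of the retrieved subgraph
```

In a RAG setup of this kind, the retrieved subgraph would then serve as the context handed to the language model.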