Graph Joint Attention Networks
- URL: http://arxiv.org/abs/2102.03147v1
- Date: Fri, 5 Feb 2021 12:51:47 GMT
- Title: Graph Joint Attention Networks
- Authors: Tiantian He, Lu Bai, Yew-Soon Ong
- Abstract summary: Graph attention networks (GATs) have been recognized as powerful tools for learning in graph structured data.
We propose Graph Joint Attention Networks (JATs) to address the aforementioned challenge.
JATs adopt novel joint attention mechanisms which can automatically determine the relative significance between node features and structural coefficients learned from the graph topology.
We theoretically analyze the expressive power of JATs and further propose an improved strategy for the joint attention mechanisms.
- Score: 24.258699912448257
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph attention networks (GATs) have been recognized as powerful tools for
learning in graph structured data. However, how to enable the attention
mechanisms in GATs to smoothly consider both structural and feature information
is still very challenging. In this paper, we propose Graph Joint Attention
Networks (JATs) to address the aforementioned challenge. Different from
previous attention-based graph neural networks (GNNs), JATs adopt novel joint
attention mechanisms which can automatically determine the relative
significance between node features and structural coefficients learned from
graph topology, when computing the attention scores. Therefore, representations
concerning more structural properties can be inferred by JATs. Besides, we
theoretically analyze the expressive power of JATs and further propose an
improved strategy for the joint attention mechanisms that enables JATs to reach
the upper bound of expressive power which every message-passing GNN can
ultimately achieve, i.e., the 1-WL test. JATs can thereby be seen as among the
most powerful message-passing GNNs. The proposed neural architecture has been
extensively
tested on widely used benchmarking datasets, and has been compared with
state-of-the-art GNNs for various downstream predictive tasks. Experimental
results show that JATs achieve state-of-the-art performance on all the testing
datasets.
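The joint attention idea described in the abstract can be illustrated with a minimal sketch (a hypothetical illustration, not the authors' implementation): an attention score for each neighbor is computed from both a feature-based term and a structural coefficient, with a weighting factor `alpha` (assumed here; in JATs this trade-off is determined automatically) blending the two before normalization.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def joint_attention_scores(h_i, h_neighbors, struct_coeffs, a_feat, alpha):
    """Hypothetical joint attention: blend a feature-based score with a
    structural coefficient derived from graph topology. alpha in [0, 1]
    weighs feature evidence against structural evidence."""
    # feature-based score per neighbor: a_feat . [h_i || h_j], as in GAT
    feat_scores = np.array(
        [a_feat @ np.concatenate([h_i, h_j]) for h_j in h_neighbors]
    )
    # convex combination of the two signals, then softmax over the neighborhood
    return softmax(alpha * feat_scores + (1 - alpha) * np.asarray(struct_coeffs))

rng = np.random.default_rng(0)
h_i = rng.normal(size=4)
h_nb = rng.normal(size=(3, 4))   # features of three neighbors
struct = [0.5, 0.3, 0.2]         # e.g. normalized structural coefficients
a = rng.normal(size=8)
w = joint_attention_scores(h_i, h_nb, struct, a, alpha=0.7)
print(w)                         # attention weights over the neighborhood
```

The resulting weights are positive and sum to one, so they can be used directly to aggregate neighbor features in a message-passing layer.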
Related papers
- TANGNN: a Concise, Scalable and Effective Graph Neural Networks with Top-m Attention Mechanism for Graph Representation Learning [7.879217146851148]
We propose an innovative Graph Neural Network (GNN) architecture that integrates a Top-m attention mechanism aggregation component and a neighborhood aggregation component.
To assess the effectiveness of our proposed model, we have applied it to citation sentiment prediction, a novel task previously unexplored in the GNN field.
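A Top-m attention aggregation of the kind this summary describes can be sketched as follows (a hypothetical illustration under assumed scoring; not the TANGNN implementation): only the m neighbors with the highest attention scores are kept, their weights renormalized, and their features averaged.

```python
import numpy as np

def top_m_aggregate(h_neighbors, scores, m):
    """Hypothetical Top-m attention aggregation: keep only the m neighbors
    with the highest attention scores, renormalize with a softmax, and
    take the weighted sum of their features."""
    scores = np.asarray(scores, dtype=float)
    m = min(m, len(scores))
    idx = np.argsort(scores)[-m:]              # indices of the top-m scores
    w = np.exp(scores[idx] - scores[idx].max())
    w /= w.sum()                               # softmax over the kept neighbors
    return (w[:, None] * h_neighbors[idx]).sum(axis=0)

h = np.arange(12, dtype=float).reshape(4, 3)   # four neighbor feature vectors
out = top_m_aggregate(h, scores=[0.1, 0.9, 0.5, 0.2], m=2)
print(out.shape)   # aggregated feature vector of the same width
```

Restricting aggregation to the top-m neighbors keeps the per-node cost bounded, which is what makes such a scheme scalable.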
arXiv Detail & Related papers (2024-11-23T05:31:25Z)
- Representation Learning on Heterophilic Graph with Directional Neighborhood Attention [8.493802098034255]
Graph Attention Network (GAT) is one of the most popular Graph Neural Network (GNN) architecture.
GAT lacks the ability to capture long-range and global graph information, leading to unsatisfactory performance on some datasets.
We propose Directional Graph Attention Network (DGAT) to combine the feature-based attention with the global directional information extracted from the graph topology.
arXiv Detail & Related papers (2024-03-03T10:59:16Z)
- Learning Topological Representations with Bidirectional Graph Attention Network for Solving Job Shop Scheduling Problem [27.904195034688257]
Existing learning-based methods for solving job shop scheduling problems (JSSP) usually use off-the-shelf GNN models tailored to undirected graphs and neglect the rich and meaningful topological structures of disjunctive graphs (DGs).
This paper proposes the topology-aware bidirectional graph attention network (TBGAT) to embed the DG for solving JSSP in a local search framework.
arXiv Detail & Related papers (2024-02-27T15:33:20Z)
- DGNN: Decoupled Graph Neural Networks with Structural Consistency between Attribute and Graph Embedding Representations [62.04558318166396]
Graph neural networks (GNNs) demonstrate a robust capability for representation learning on graphs with complex structures.
A novel GNNs framework, dubbed Decoupled Graph Neural Networks (DGNN), is introduced to obtain a more comprehensive embedding representation of nodes.
Experimental results conducted on several graph benchmark datasets verify DGNN's superiority in node classification task.
arXiv Detail & Related papers (2024-01-28T06:43:13Z)
- DEGREE: Decomposition Based Explanation For Graph Neural Networks [55.38873296761104]
We propose DEGREE to provide a faithful explanation for GNN predictions.
By decomposing the information generation and aggregation mechanism of GNNs, DEGREE allows tracking the contributions of specific components of the input graph to the final prediction.
We also design a subgraph level interpretation algorithm to reveal complex interactions between graph nodes that are overlooked by previous methods.
arXiv Detail & Related papers (2023-05-22T10:29:52Z)
- Learnable Graph Convolutional Attention Networks [7.465923786151107]
Graph Neural Networks (GNNs) compute the message exchange between nodes by either aggregating uniformly (convolving) the features of all the neighboring nodes, or by applying a non-uniform score (attending) to the features.
Recent works have shown the strengths and weaknesses of the resulting GNN architectures, respectively, GCNs and GATs.
We introduce the graph convolutional attention layer (CAT), which relies on convolutions to compute the attention scores.
Our results demonstrate that L-CAT is able to efficiently combine different GNN layers along the network, outperforming competing methods across a wide range of datasets.
arXiv Detail & Related papers (2022-11-21T21:08:58Z)
- Affinity-Aware Graph Networks [9.888383815189176]
Graph Neural Networks (GNNs) have emerged as a powerful technique for learning on relational data.
We explore the use of affinity measures as features in graph neural networks.
We propose message passing networks based on these features and evaluate their performance on a variety of node and graph property prediction tasks.
arXiv Detail & Related papers (2022-06-23T18:51:35Z)
- Automatic Relation-aware Graph Network Proliferation [182.30735195376792]
We propose Automatic Relation-aware Graph Network Proliferation (ARGNP) for efficiently searching GNNs.
These operations can extract hierarchical node/relational information and provide anisotropic guidance for message passing on a graph.
Experiments on six datasets for four graph learning tasks demonstrate that GNNs produced by our method are superior to the current state-of-the-art hand-crafted and search-based GNNs.
arXiv Detail & Related papers (2022-05-31T10:38:04Z)
- Learning to Drop: Robust Graph Neural Network via Topological Denoising [50.81722989898142]
We propose PTDNet, a parameterized topological denoising network, to improve the robustness and generalization performance of Graph Neural Networks (GNNs).
PTDNet prunes task-irrelevant edges by penalizing the number of edges in the sparsified graph with parameterized networks.
We show that PTDNet can improve the performance of GNNs significantly and the performance gain becomes larger for more noisy datasets.
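The pruning idea can be sketched in miniature (a hypothetical illustration, not PTDNet's parameterized networks): each edge gets a learned keep-probability, a penalty on the expected number of kept edges encourages sparsity during training, and a hard threshold removes edges at inference time.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sparsify_edges(edge_logits, threshold=0.5, penalty_weight=0.01):
    """Hypothetical edge-pruning step: each edge has a learned logit that
    is mapped to a keep-probability; penalizing the expected edge count
    pushes task-irrelevant edges toward removal."""
    keep_prob = sigmoid(np.asarray(edge_logits, dtype=float))
    penalty = penalty_weight * keep_prob.sum()   # expected number of kept edges
    mask = keep_prob > threshold                 # hard mask at inference time
    return mask, penalty

mask, pen = sparsify_edges([2.0, -3.0, 0.1, 4.0])
print(mask)   # edges kept after thresholding
```

In training, the `penalty` term would be added to the task loss so that gradient descent trades predictive accuracy against graph sparsity.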
arXiv Detail & Related papers (2020-11-13T18:53:21Z)
- Distance Encoding: Design Provably More Powerful Neural Networks for Graph Representation Learning [63.97983530843762]
Graph Neural Networks (GNNs) have achieved great success in graph representation learning.
GNNs generate identical representations for graph substructures that may in fact be very different.
More powerful GNNs, proposed recently by mimicking higher-order tests, are inefficient as they cannot exploit the sparsity of the underlying graph structure.
We propose Distance Encoding (DE) as a new class of graph representation learning methods.
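One simple instance of a distance-based feature can be sketched as follows (a hypothetical illustration of the general idea, not the paper's exact encoding): compute each node's shortest-path distance to a chosen target node set by breadth-first search and append it, capped, to the node features, giving message passing a signal that plain GNNs cannot recover.

```python
from collections import deque

def distance_encoding(adj, targets, cap=3):
    """Hypothetical distance feature: for each node, the shortest-path
    distance to a target node set, computed by BFS and capped at `cap`.
    Such distances distinguish nodes that look identical to plain
    message passing."""
    dist = {t: 0 for t in targets}
    q = deque(targets)
    while q:
        v = q.popleft()
        for u in adj[v]:
            if u not in dist:
                dist[u] = dist[v] + 1
                q.append(u)
    return [min(dist.get(v, cap), cap) for v in range(len(adj))]

# distances from node 0 on a 6-cycle
cycle = {0: [1, 5], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3, 5], 5: [4, 0]}
print(distance_encoding(cycle, targets={0}))  # [0, 1, 2, 3, 2, 1]
```

Because all nodes of a cycle have identical degree and identical neighborhoods up to isomorphism, a vanilla GNN assigns them the same representation; the distance feature breaks that symmetry.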
arXiv Detail & Related papers (2020-08-31T23:15:40Z)
- Improving Graph Neural Network Expressivity via Subgraph Isomorphism Counting [63.04999833264299]
"Graph Substructure Networks" (GSN) is a topologically-aware message passing scheme based on substructure encoding.
We show that it is strictly more expressive than the Weisfeiler-Leman (WL) graph isomorphism test.
We perform an extensive evaluation on graph classification and regression tasks and obtain state-of-the-art results in diverse real-world settings.
arXiv Detail & Related papers (2020-06-16T15:30:31Z)
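Several entries above, including the JAT abstract and the GSN summary, measure expressive power against the Weisfeiler-Leman (WL) test. A minimal 1-WL color refinement can be sketched as follows (a standard textbook procedure, not code from any of the papers):

```python
from collections import Counter

def wl_refine(adj, labels, iters=3):
    """1-dimensional Weisfeiler-Leman colour refinement: repeatedly hash
    each node's colour together with the sorted multiset of its
    neighbours' colours. Different colour histograms prove two graphs
    non-isomorphic; equal histograms are inconclusive."""
    colors = list(labels)
    for _ in range(iters):
        colors = [
            hash((colors[v], tuple(sorted(colors[u] for u in adj[v]))))
            for v in range(len(adj))
        ]
    return Counter(colors)

# a triangle and a 3-node path get distinct colour histograms
tri = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
path = {0: [1], 1: [0, 2], 2: [1]}
print(wl_refine(tri, [0, 0, 0]) != wl_refine(path, [0, 0, 0]))  # True
```

Message-passing GNNs refine node representations in the same neighborhood-multiset fashion, which is why 1-WL is the upper bound on their distinguishing power that the JAT abstract refers to.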
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.