Bi-Level Attention Graph Neural Networks
- URL: http://arxiv.org/abs/2304.11533v1
- Date: Sun, 23 Apr 2023 04:18:56 GMT
- Title: Bi-Level Attention Graph Neural Networks
- Authors: Roshni G. Iyer, Wei Wang, Yizhou Sun
- Abstract summary: We present Bi-Level Attention Graph Neural Networks (BA-GNN), scalable neural networks (NNs) that use a novel bi-level graph attention mechanism.
BA-GNN models both node-node and relation-relation interactions in a personalized way, by hierarchically attending to both types of information from local neighborhood contexts.
- Score: 31.543368116788706
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recent graph neural networks (GNNs) with the attention mechanism have
historically been limited to small-scale homogeneous graphs (HoGs). However,
GNNs handling heterogeneous graphs (HeGs), which contain several entity and
relation types, all have shortcomings in handling attention. Most GNNs that
learn graph attention for HeGs learn either node-level or relation-level
attention, but not both, limiting their ability to predict both important
entities and relations in the HeG. Even the best existing method that learns
both levels of attention assumes graph relations are independent, so its
learned attention disregards the dependencies among
relations. To effectively model both multi-relational and multi-entity
large-scale HeGs, we present Bi-Level Attention Graph Neural Networks (BA-GNN),
scalable neural networks (NNs) that use a novel bi-level graph attention
mechanism. BA-GNN models both node-node and relation-relation interactions in a
personalized way, by hierarchically attending to both types of information from
local neighborhood contexts instead of the global graph context. Rigorous
experiments on seven real-world HeGs show BA-GNN consistently outperforms all
baselines, and demonstrate quality and transferability of its learned
relation-level attention to improve performance of other GNNs.
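The abstract describes hierarchical attention at two levels: node-level attention within each relation's neighborhood, then relation-level attention over the per-relation messages. A minimal sketch of that idea is below; the attention parametrization (concatenation followed by a learned vector) and all names are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over the last axis.
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def bi_level_attention(h_target, neighbors_by_relation, rng=np.random.default_rng(0)):
    """Illustrative bi-level aggregation for one target node.

    h_target: (d,) embedding of the target node.
    neighbors_by_relation: dict mapping relation name -> (n_r, d) array
        of neighbor embeddings under that relation.
    Returns an updated (d,) embedding.
    """
    d = h_target.shape[0]
    # Hypothetical learnable parameters, randomly initialized here for the sketch.
    a_node = rng.standard_normal(2 * d)  # node-level attention vector
    a_rel = rng.standard_normal(2 * d)   # relation-level attention vector

    rel_messages, rel_scores = [], []
    for rel, H in neighbors_by_relation.items():
        # Node-level attention: score each neighbor against the target node.
        pairs = np.concatenate([np.tile(h_target, (H.shape[0], 1)), H], axis=1)
        alpha = softmax(pairs @ a_node)  # (n_r,) weights over neighbors
        m_rel = alpha @ H                # per-relation message, (d,)
        rel_messages.append(m_rel)
        # Relation-level attention: score each relation's aggregated message.
        rel_scores.append(np.concatenate([h_target, m_rel]) @ a_rel)

    beta = softmax(np.array(rel_scores))  # weights over relations
    return np.tanh(sum(b * m for b, m in zip(beta, rel_messages)))
```

Note the hierarchy: both attention levels are computed from the local neighborhood context only, which is what allows this style of mechanism to scale to large graphs.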
Related papers
- Graph as a feature: improving node classification with non-neural graph-aware logistic regression [2.952177779219163]
Graph-aware Logistic Regression (GLR) is a non-neural model designed for node classification tasks.
Unlike traditional graph algorithms that use only a fraction of the information accessible to GNNs, our proposed model simultaneously leverages both node features and the relationships between entities.
arXiv Detail & Related papers (2024-11-19T08:32:14Z)
- A Manifold Perspective on the Statistical Generalization of Graph Neural Networks [84.01980526069075]
We take a manifold perspective to establish the statistical generalization theory of GNNs on graphs sampled from a manifold in the spectral domain.
We prove that the generalization bounds of GNNs decrease linearly with the size of the graphs in the logarithmic scale, and increase linearly with the spectral continuity constants of the filter functions.
arXiv Detail & Related papers (2024-06-07T19:25:02Z)
- Hierarchical Attention Models for Multi-Relational Graphs [40.143721982295915]
We present Bi-Level Attention-Based Graph Convolutional Networks (BR-GCN).
BR-GCN models use bi-level attention to learn node embeddings through (1) node-level attention, and (2) relation-level attention.
We show that BR-GCN's attention mechanism is both scalable and more effective in learning compared to state-of-the-art GNNs.
arXiv Detail & Related papers (2024-04-14T21:37:39Z)
- 2-hop Neighbor Class Similarity (2NCS): A graph structural metric indicative of graph neural network performance [4.051099980410583]
Graph Neural Networks (GNNs) achieve state-of-the-art performance on graph-structured data across numerous domains.
On heterophilous graphs, in which connected nodes tend to belong to different classes, GNNs perform less consistently.
We introduce 2-hop Neighbor Class Similarity (2NCS), a new quantitative graph structural property that correlates with GNN performance more strongly and consistently than alternative metrics.
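One plausible reading of a 2-hop class-similarity metric is sketched below: for each node, measure the fraction of its strictly 2-hop neighbors sharing its class label, then average over nodes. This is an illustrative interpretation only; the paper's exact definition of 2NCS may differ.

```python
import numpy as np

def two_ncs(adj, labels):
    """Hypothetical 2NCS-style score, averaged over all nodes.

    adj: (n, n) binary adjacency matrix (no self-loops).
    labels: (n,) integer class labels.
    """
    n = adj.shape[0]
    # Nodes reachable in exactly 2 hops, excluding direct neighbors.
    two_hop = (adj @ adj > 0) & ~adj.astype(bool)
    np.fill_diagonal(two_hop, False)  # exclude the node itself
    scores = []
    for i in range(n):
        nbrs = np.flatnonzero(two_hop[i])
        if nbrs.size:
            scores.append(np.mean(labels[nbrs] == labels[i]))
    return float(np.mean(scores)) if scores else 0.0
```

On a strongly heterophilous graph such as a bipartite two-class cycle, direct neighbors disagree in class but 2-hop neighbors agree, so a metric of this form stays high even where edge-level homophily measures collapse.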
arXiv Detail & Related papers (2022-12-26T16:16:51Z)
- Relation Embedding based Graph Neural Networks for Handling Heterogeneous Graph [58.99478502486377]
We propose a simple yet efficient framework to make the homogeneous GNNs have adequate ability to handle heterogeneous graphs.
Specifically, we propose Relation Embedding based Graph Neural Networks (RE-GNNs), which employ only one parameter per relation to embed the importance of edge type relations and self-loop connections.
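The "one parameter per relation" idea can be sketched as a single learnable scalar that scales each relation's messages before a shared homogeneous transform. The function signature and names below are illustrative assumptions, not the RE-GNN implementation.

```python
import numpy as np

def re_gnn_layer(h, edges_by_relation, rel_weight, self_weight, W):
    """Sketch of a layer in the spirit of RE-GNN.

    h: (n, d) node features.
    edges_by_relation: dict mapping relation -> list of (src, dst) edges.
    rel_weight: dict mapping relation -> scalar importance (the "one parameter").
    self_weight: scalar weight for the self-loop connection.
    W: (d, d_out) weight matrix shared across relations, as in a homogeneous GNN.
    """
    agg = self_weight * h  # self-loop contribution
    for rel, edges in edges_by_relation.items():
        for src, dst in edges:
            # Each relation contributes messages scaled by its single scalar.
            agg = agg.copy()
            agg[dst] += rel_weight[rel] * h[src]
    return np.tanh(agg @ W)
```

The appeal of this design is parameter efficiency: a heterogeneous graph with R relations adds only R extra scalars on top of an ordinary homogeneous GNN, rather than a full weight matrix per relation.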
arXiv Detail & Related papers (2022-09-23T05:24:18Z)
- MentorGNN: Deriving Curriculum for Pre-Training GNNs [61.97574489259085]
We propose an end-to-end model named MentorGNN that aims to supervise the pre-training process of GNNs across graphs.
We shed new light on the problem of domain adaption on relational data (i.e., graphs) by deriving a natural and interpretable upper bound on the generalization error of the pre-trained GNNs.
arXiv Detail & Related papers (2022-08-21T15:12:08Z)
- Discovering the Representation Bottleneck of Graph Neural Networks from Multi-order Interactions [51.597480162777074]
Graph neural networks (GNNs) rely on the message passing paradigm to propagate node features and build interactions.
Recent works point out that different graph learning tasks require different ranges of interactions between nodes.
We study two common graph construction methods in scientific domains, i.e., K-nearest neighbor (KNN) graphs and fully-connected (FC) graphs.
arXiv Detail & Related papers (2022-05-15T11:38:14Z)
- Graph Neural Networks for Graphs with Heterophily: A Survey [98.45621222357397]
We provide a comprehensive review of graph neural networks (GNNs) for heterophilic graphs.
Specifically, we propose a systematic taxonomy that essentially governs existing heterophilic GNN models.
We discuss the correlation between graph heterophily and various graph research domains, aiming to facilitate the development of more effective GNNs.
arXiv Detail & Related papers (2022-02-14T23:07:47Z)
- Hierarchical Message-Passing Graph Neural Networks [12.207978823927386]
We propose a novel Hierarchical Message-passing Graph Neural Networks framework.
The key idea is to generate a hierarchical structure that re-organises all nodes in a flat graph into multi-level super graphs.
We present the first model to implement this framework, termed Hierarchical Community-aware Graph Neural Network (HC-GNN).
arXiv Detail & Related papers (2020-09-08T13:11:07Z)
- Bilinear Graph Neural Network with Neighbor Interactions [106.80781016591577]
Graph Neural Network (GNN) is a powerful model to learn representations and make predictions on graph data.
We propose a new graph convolution operator, which augments the weighted sum with pairwise interactions of the representations of neighbor nodes.
We term this framework as Bilinear Graph Neural Network (BGNN), which improves GNN representation ability with bilinear interactions between neighbor nodes.
arXiv Detail & Related papers (2020-02-10T06:43:38Z)
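The bilinear aggregator described in the BGNN entry above can be sketched as averaging elementwise products over all pairs of transformed neighbor representations, capturing pairwise neighbor interactions that a plain weighted sum misses. The code below is an illustrative sketch, not BGNN's exact operator.

```python
import numpy as np
from itertools import combinations

def bilinear_aggregate(neighbor_feats, W):
    """Average pairwise elementwise products of transformed neighbors.

    neighbor_feats: (k, d) features of a node's neighbors (k >= 2).
    W: (d, d_out) transform applied before the pairwise interaction.
    """
    z = neighbor_feats @ W  # (k, d_out) transformed neighbor representations
    pairs = list(combinations(range(z.shape[0]), 2))
    inter = sum(z[i] * z[j] for i, j in pairs)  # elementwise products
    return inter / len(pairs)
```

For efficiency, the pairwise sum can equivalently be computed in closed form as ((sum of z)^2 - sum of z^2) / 2, avoiding the explicit loop over neighbor pairs.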
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.