Bilinear Graph Neural Network with Neighbor Interactions
- URL: http://arxiv.org/abs/2002.03575v5
- Date: Sat, 30 May 2020 02:41:36 GMT
- Title: Bilinear Graph Neural Network with Neighbor Interactions
- Authors: Hongmin Zhu, Fuli Feng, Xiangnan He, Xiang Wang, Yan Li, Kai Zheng,
Yongdong Zhang
- Abstract summary: Graph Neural Network (GNN) is a powerful model to learn representations and make predictions on graph data.
We propose a new graph convolution operator, which augments the weighted sum with pairwise interactions of the representations of neighbor nodes.
We term this framework the Bilinear Graph Neural Network (BGNN), which improves GNN representation ability with bilinear interactions between neighbor nodes.
- Score: 106.80781016591577
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph Neural Network (GNN) is a powerful model to learn representations and
make predictions on graph data. Existing efforts on GNN have largely defined
the graph convolution as a weighted sum of the features of the connected nodes
to form the representation of the target node. Nevertheless, the weighted-sum
operation assumes that the neighbor nodes are independent of each other and
ignores the possible interactions between them. When such interactions exist,
for example when the co-occurrence of two neighbor nodes is a strong signal of
the target node's characteristics, existing GNN models may fail to capture that
signal. In this work, we argue for the importance of modeling the interactions
between neighbor nodes in GNN. We propose a new graph convolution operator,
which augments the weighted sum with pairwise interactions of the
representations of neighbor nodes. We term this framework the Bilinear Graph
Neural Network (BGNN), which improves GNN representation ability with bilinear
interactions between neighbor nodes. In particular, we specify two BGNN models
named BGCN and BGAT, based on the well-known GCN and GAT, respectively.
Empirical results on three public benchmarks of semi-supervised node
classification verify the effectiveness of BGNN -- BGCN (BGAT) outperforms GCN
(GAT) by 1.6% (1.5%) in classification accuracy. Codes are available at:
https://github.com/zhuhm1996/bgnn.
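As a concrete illustration of the operator described in the abstract, the following is a minimal PyTorch sketch that combines a weighted-sum aggregation with a bilinear term averaging element-wise products over all pairs of transformed neighbors. The dense adjacency, the mean normalization of the linear part, and all names (bgnn_layer, w_lin, w_bi, alpha) are simplifying assumptions for illustration, not the paper's exact BGCN/BGAT formulation; see the linked repository for the authors' implementation.

```python
import torch

def bgnn_layer(h, a_hat, w_lin, w_bi, alpha=0.5):
    # h:      (N, d)  input node representations
    # a_hat:  (N, N)  binary adjacency with self-loops (float tensor)
    # w_lin, w_bi: (d, d') weight matrices for the two aggregators
    # alpha:  trade-off between weighted-sum and bilinear aggregation
    deg = a_hat.sum(dim=1, keepdim=True).clamp(min=1.0)   # neighborhood sizes

    # 1) conventional weighted-sum aggregation (a simple mean here; GCN/GAT
    #    would use symmetric normalization or attention weights instead)
    lin_part = (a_hat @ (h @ w_lin)) / deg

    # 2) bilinear aggregation: average element-wise product over all neighbor
    #    pairs, using sum_{i<j} x_i * x_j = ((sum_i x_i)^2 - sum_i x_i^2) / 2
    hw = h @ w_bi
    s = a_hat @ hw              # sum of transformed neighbors
    q = a_hat @ (hw * hw)       # sum of their element-wise squares
    n_pairs = (deg * (deg - 1.0) / 2.0).clamp(min=1.0)
    bi_part = 0.5 * (s * s - q) / n_pairs

    return torch.relu((1.0 - alpha) * lin_part + alpha * bi_part)
```

In this sketch, the pair-sum identity keeps the bilinear term at the same asymptotic cost as the ordinary aggregation, which is what makes pairwise neighbor interactions practical to compute.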
Related papers
- 2-hop Neighbor Class Similarity (2NCS): A graph structural metric indicative of graph neural network performance [4.051099980410583]
Graph Neural Networks (GNNs) achieve state-of-the-art performance on graph-structured data across numerous domains.
On heterophilous graphs, in which nodes of different types are likely to be connected, GNNs perform less consistently.
We introduce 2-hop Neighbor Class Similarity (2NCS), a new quantitative graph structural property that correlates with GNN performance more strongly and consistently than alternative metrics.
arXiv Detail & Related papers (2022-12-26T16:16:51Z)
- High-Order Pooling for Graph Neural Networks with Tensor Decomposition [23.244580796300166]
Graph Neural Networks (GNNs) are attracting growing attention due to their effectiveness and flexibility in modeling a variety of graph-structured data.
We propose the Tensorized Graph Neural Network (tGNN), a highly expressive GNN architecture relying on tensor decomposition to model high-order non-linear node interactions.
arXiv Detail & Related papers (2022-05-24T01:12:54Z)
- Exploiting Neighbor Effect: Conv-Agnostic GNNs Framework for Graphs with Heterophily [58.76759997223951]
We propose a new metric based on von Neumann entropy to re-examine the heterophily problem of GNNs.
We also propose a Conv-Agnostic GNN framework (CAGNNs) to enhance the performance of most GNNs on heterophily datasets.
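The summary above does not spell out how the heterophily metric is constructed, so the snippet below only shows the standard von Neumann graph entropy that such a metric would build on (the eigenvalue entropy of the trace-normalized Laplacian); it is an illustrative sketch, not the paper's metric.

```python
import torch

def von_neumann_graph_entropy(a):
    # a: (N, N) symmetric adjacency matrix (float tensor)
    deg = a.sum(dim=1)
    lap = torch.diag(deg) - a                  # combinatorial Laplacian
    rho = lap / deg.sum().clamp(min=1.0)       # density matrix, trace(rho) = 1
    evals = torch.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]               # convention: 0 * log 0 = 0
    return float(-(evals * evals.log()).sum())
```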
arXiv Detail & Related papers (2022-03-19T14:26:43Z)
- A Variational Edge Partition Model for Supervised Graph Representation Learning [51.30365677476971]
This paper introduces a graph generative process to model how the observed edges are generated by aggregating the node interactions over a set of overlapping node communities.
We partition each edge into the summation of multiple community-specific weighted edges and use them to define community-specific GNNs.
A variational inference framework is proposed to jointly learn a GNN-based inference network that partitions the edges into different communities, these community-specific GNNs, and a GNN-based predictor that combines the community-specific GNNs for the end classification task.
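Setting the variational inference machinery aside, the core idea of splitting each edge into community-specific weighted edges that feed community-specific GNNs can be sketched as below; the dense edge-score tensor, the softmax partition, and all names are assumptions made for illustration.

```python
import torch

def community_partitioned_forward(h, a, edge_comm_logits, gnn_layers, predictor):
    # h:                (N, d)     node features
    # a:                (N, N)     observed adjacency (0/1 float tensor)
    # edge_comm_logits: (N, N, K)  learnable per-edge community scores
    # gnn_layers:       list of K community-specific GNN layers (callables)
    # predictor:        callable combining the K community representations
    #
    # Each observed edge is split into K weighted edges (weights sum to 1 over
    # communities); every community GNN runs on its own weighted graph.
    comm_w = torch.softmax(edge_comm_logits, dim=-1) * a.unsqueeze(-1)
    reps = [gnn(h, comm_w[..., k]) for k, gnn in enumerate(gnn_layers)]
    return predictor(torch.cat(reps, dim=-1))
```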
arXiv Detail & Related papers (2022-02-07T14:37:50Z)
- Graph Neural Networks with Feature and Structure Aware Random Walk [7.143879014059894]
We show that in typical heterophilous graphs, the edges may be directed, and whether to treat the edges as is or simply make them undirected greatly affects the performance of the GNN models.
We develop a model that adaptively learns the directionality of the graph, and exploits the underlying long-distance correlations between nodes.
arXiv Detail & Related papers (2021-11-19T08:54:21Z)
- Identity-aware Graph Neural Networks [63.6952975763946]
We develop Identity-aware Graph Neural Networks (ID-GNNs), a class of message-passing GNNs with greater expressive power than the 1-WL test.
ID-GNN extends existing GNN architectures by inductively considering nodes' identities during message passing.
We show that transforming existing GNNs to ID-GNNs yields on average 40% accuracy improvement on challenging node, edge, and graph property prediction tasks.
arXiv Detail & Related papers (2021-01-25T18:59:01Z)
- A Unified View on Graph Neural Networks as Graph Signal Denoising [49.980783124401555]
Graph Neural Networks (GNNs) have risen to prominence in learning representations for graph structured data.
In this work, we establish mathematically that the aggregation processes in a group of representative GNN models can be regarded as solving a graph denoising problem.
We instantiate a novel GNN model, ADA-UGNN, derived from UGNN, to handle graphs with adaptive smoothness across nodes.
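Concretely, the denoising view can be stated with a standard graph signal denoising objective; the exact objective and step size matched to each GNN variant in the paper may differ, so the lines below are only a sketch of the general argument.

```latex
% Denoising a noisy signal X on a graph with normalized Laplacian L = I - \tilde{A}:
\[
  \min_{F}\; \|F - X\|_F^2 + c\,\operatorname{tr}\!\bigl(F^{\top} L F\bigr)
\]
% One gradient step from F = X with step size 1/2 gives
\[
  F \leftarrow X - c\,L X = \bigl((1-c)\,I + c\,\tilde{A}\bigr) X,
\]
% which for c = 1 has exactly the form of a GCN-style aggregation \tilde{A} X.
```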
arXiv Detail & Related papers (2020-10-05T04:57:18Z)
- Label-Consistency based Graph Neural Networks for Semi-supervised Node Classification [47.753422069515366]
Graph neural networks (GNNs) achieve remarkable success in graph-based semi-supervised node classification.
In this paper, we propose the label-consistency based graph neural network (LC-GNN), which leverages node pairs that are unconnected but share the same label to enlarge the receptive field of nodes in GNNs.
Experiments on benchmark datasets demonstrate that the proposed LC-GNN outperforms traditional GNNs in graph-based semi-supervised node classification.
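The weighting and sampling details are not given in the summary above; the sketch below only illustrates the basic idea of connecting unconnected, same-label training nodes so that message passing can reach them (the dense adjacency and the function name are assumptions).

```python
import torch

def add_label_consistent_edges(a_hat, labels, train_mask):
    # a_hat:      (N, N) adjacency with self-loops (float tensor)
    # labels:     (N,)   class labels, trusted only where train_mask is True
    # train_mask: (N,)   boolean mask of labeled training nodes
    idx = train_mask.nonzero(as_tuple=True)[0]
    same_label = (labels[idx].unsqueeze(0) == labels[idx].unsqueeze(1)).float()
    aug = a_hat.clone()
    # connect every pair of labeled nodes that shares a class label
    aug[idx.unsqueeze(1), idx.unsqueeze(0)] = torch.maximum(
        aug[idx.unsqueeze(1), idx.unsqueeze(0)], same_label
    )
    return aug
```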
arXiv Detail & Related papers (2020-07-27T11:17:46Z)
- Binarized Graph Neural Network [65.20589262811677]
We develop a binarized graph neural network to learn the binary representations of the nodes with binary network parameters.
Our proposed method can be seamlessly integrated into the existing GNN-based embedding approaches.
Experiments indicate that the proposed binarized graph neural network, namely BGN, is orders of magnitude more efficient in terms of both time and space.
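The exact binarization scheme of BGN is not described in the summary above; the following generic sign binarization with a straight-through estimator only illustrates the kind of operation such a model relies on.

```python
import torch

class BinarizeSTE(torch.autograd.Function):
    # Sign binarization with a straight-through estimator, so gradients still
    # reach the underlying real-valued parameters during training.
    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return torch.sign(x)

    @staticmethod
    def backward(ctx, grad_out):
        (x,) = ctx.saved_tensors
        return grad_out * (x.abs() <= 1).float()  # hard-tanh gradient window

def binarized_linear(h, w_real):
    # Binarize both node representations and weights; at inference the matmul
    # could then be replaced by cheap bitwise operations.
    return BinarizeSTE.apply(h) @ BinarizeSTE.apply(w_real)
```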
arXiv Detail & Related papers (2020-04-19T09:43:14Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences of its use.