Enhance Information Propagation for Graph Neural Network by
Heterogeneous Aggregations
- URL: http://arxiv.org/abs/2102.04064v1
- Date: Mon, 8 Feb 2021 08:57:56 GMT
- Title: Enhance Information Propagation for Graph Neural Network by
Heterogeneous Aggregations
- Authors: Dawei Leng, Jinjiang Guo, Lurong Pan, Jie Li, Xinyu Wang
- Abstract summary: Graph neural networks are emerging as a continuation of deep learning's success on graph data.
We propose to enhance information propagation among GNN layers by combining heterogeneous aggregations.
We empirically validate the effectiveness of HAG-Net on a number of graph classification benchmarks.
- Score: 7.3136594018091134
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Graph neural networks are emerging as a continuation of deep learning's
success on graph data. Tens of different graph neural network variants have been
proposed, most following a neighborhood aggregation scheme in which node
features are updated from layer to layer by aggregating the features of
neighboring nodes. Though related research has surged, the power of GNNs is
still not on par with that of their counterparts, CNNs in computer vision and
RNNs in natural language processing. We rethink this problem from the
perspective of information propagation and propose to enhance information
propagation among GNN layers by combining heterogeneous aggregations. We argue
that as richer information is propagated from shallow to deep layers, the
discriminative capability of the features formulated by the GNN benefits. As a
first attempt in this direction, we propose a new generic GNN layer formulation
and, built upon it, a new GNN variant referred to as HAG-Net. We empirically
validate the effectiveness of HAG-Net on a number of graph classification
benchmarks and elaborate on all the design options and criteria along the way.
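To make the direction concrete, here is a minimal sketch, in plain numpy, of a layer that combines heterogeneous aggregations (sum, mean, and max over neighbors) before a shared linear transform. It illustrates the general idea only; the specific aggregators and merge scheme used by HAG-Net are those defined in the paper, and all names below are illustrative.

```python
# Hedged sketch: combine heterogeneous neighborhood aggregations (sum, mean, max)
# in one GNN layer. General idea only; not the authors' exact HAG-Net formulation.
import numpy as np

def hetero_agg_layer(X, A, W):
    """X: (n, d) node features; A: (n, n) 0/1 adjacency; W: (3*d, d_out) weights."""
    deg = A.sum(axis=1, keepdims=True).clip(min=1)           # node degrees
    agg_sum = A @ X                                          # sum over neighbors
    agg_mean = agg_sum / deg                                 # mean over neighbors
    # max over neighbors: mask non-neighbors with -inf, then take row-wise max
    masked = np.where(A[:, :, None] > 0, X[None, :, :], -np.inf)
    agg_max = masked.max(axis=1)
    agg_max[~np.isfinite(agg_max)] = 0.0                     # isolated nodes
    h = np.concatenate([agg_sum, agg_mean, agg_max], axis=1) # heterogeneous combine
    return np.maximum(h @ W, 0.0)                            # linear map + ReLU

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 4))
A = (rng.random((5, 5)) < 0.4).astype(float); np.fill_diagonal(A, 0)
W = rng.normal(size=(12, 8))
print(hetero_agg_layer(X, A, W).shape)  # (5, 8)
```

Concatenating several aggregators lets deeper layers see complementary neighborhood statistics (totals, averages, extremes) that any single aggregator would discard, which is one way richer information can propagate from shallow to deep layers.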
Related papers
- Conditional Local Feature Encoding for Graph Neural Networks [14.983942698240293]
Graph neural networks (GNNs) have shown great success in learning from graph-based data.
The key mechanism of current GNNs is message passing, where a node's feature is updated based on the information passed from its local neighbourhood.
We propose conditional local feature encoding (CLFE) to help prevent node features from being dominated by information from the local neighbourhood.
arXiv Detail & Related papers (2024-05-08T01:51:19Z)
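The summary above does not spell out the CLFE operator, so the following is a hedged sketch of one plausible reading: gate the aggregated neighborhood message by the node's own features, so the update stays conditioned on the node itself rather than being dominated by its neighbourhood. The gating form and all names are assumptions for illustration, not the paper's definition.

```python
# Hedged sketch of one plausible reading of "conditional local feature encoding":
# gate the aggregated neighborhood message by the node's own features. The actual
# CLFE operator is defined in the paper; names below are illustrative.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def clfe_update(X, A, Wg, Wm):
    """X: (n, d); A: (n, n); Wg, Wm: (d, d) hypothetical gate/message weights."""
    deg = A.sum(axis=1, keepdims=True).clip(min=1)
    msg = (A @ X) / deg                        # mean-aggregated neighborhood message
    gate = sigmoid(X @ Wg)                     # gate conditioned on the node's own state
    return gate * (msg @ Wm) + (1 - gate) * X  # convex mix of message and self features

rng = np.random.default_rng(1)
X = rng.normal(size=(4, 3))
A = np.ones((4, 4)) - np.eye(4)                # toy complete graph
print(clfe_update(X, A, rng.normal(size=(3, 3)), rng.normal(size=(3, 3))).shape)
```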
- DEGREE: Decomposition Based Explanation For Graph Neural Networks [55.38873296761104]
We propose DEGREE to provide a faithful explanation for GNN predictions.
By decomposing the information generation and aggregation mechanism of GNNs, DEGREE allows tracking the contributions of specific components of the input graph to the final prediction.
We also design a subgraph level interpretation algorithm to reveal complex interactions between graph nodes that are overlooked by previous methods.
arXiv Detail & Related papers (2023-05-22T10:29:52Z)
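For intuition about decomposition-based contribution tracking, the toy sketch below treats a purely linear GNN layer, where the output decomposes exactly into per-subset contributions; DEGREE itself extends such decomposition to the non-linear information generation and aggregation of real GNNs, so this is only the simplest case.

```python
# Hedged sketch of the decomposition idea for a purely linear GNN layer: because
# aggregation is linear, the layer output splits exactly into per-subset
# contributions. DEGREE also handles non-linear layers; this toy does not.
import numpy as np

def contribution(X, A, W, subset):
    """Contribution of nodes in `subset` to the output of a linear layer A @ X @ W."""
    mask = np.zeros((X.shape[0], 1))
    mask[list(subset)] = 1.0
    return A @ (X * mask) @ W                # aggregate only the subset's features

rng = np.random.default_rng(2)
X, W = rng.normal(size=(5, 3)), rng.normal(size=(3, 2))
A = (rng.random((5, 5)) < 0.5).astype(float)
full = A @ X @ W
parts = contribution(X, A, W, {0, 1}) + contribution(X, A, W, {2, 3, 4})
print(np.allclose(full, parts))              # True: contributions sum to the output
```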
- Relation Embedding based Graph Neural Networks for Handling Heterogeneous Graph [58.99478502486377]
We propose a simple yet efficient framework to give homogeneous GNNs adequate ability to handle heterogeneous graphs.
Specifically, we propose Relation Embedding based Graph Neural Networks (RE-GNNs), which employ only one parameter per relation to embed the importance of edge type relations and self-loop connections.
arXiv Detail & Related papers (2022-09-23T05:24:18Z)
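A hedged sketch of the stated mechanism: one learnable scalar per relation (edge type) plus one for self-loops, scaling the messages of an otherwise homogeneous aggregator. Where exactly RE-GNN places these scalars follows the paper; the function and variable names here are illustrative.

```python
# Hedged sketch of the RE-GNN idea: one scalar per relation weighs messages over
# that relation, plus one scalar for self-loops. Illustration only.
import numpy as np

def re_gnn_layer(X, A_by_rel, rel_w, self_w, W):
    """X: (n, d); A_by_rel: list of (n, n) adjacencies, one per relation;
    rel_w: (R,) relation scalars; self_w: self-loop scalar; W: (d, d_out)."""
    n = X.shape[0]
    A_mix = sum(w * A for w, A in zip(rel_w, A_by_rel)) + self_w * np.eye(n)
    return np.maximum(A_mix @ X @ W, 0.0)

rng = np.random.default_rng(3)
X = rng.normal(size=(6, 4))
A_by_rel = [(rng.random((6, 6)) < 0.3).astype(float) for _ in range(2)]
print(re_gnn_layer(X, A_by_rel, np.array([0.7, 1.3]), 1.0, rng.normal(size=(4, 5))).shape)
```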
- Two-level Graph Neural Network [15.014364222532347]
We propose a novel GNN framework, referred to as the Two-level GNN (TL-GNN), which merges subgraph-level information with node-level information.
Experiments show that TL-GNN outperforms existing GNNs and achieves state-of-the-art performance.
arXiv Detail & Related papers (2022-01-03T02:15:20Z)
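A minimal sketch of the two-level idea, assuming the simplest possible subgraph (each node's 1-hop neighborhood including itself) summarized by mean pooling and concatenated with the node-level features; TL-GNN's actual subgraph construction and merge operation are specified in the paper.

```python
# Hedged sketch: merge a pooled subgraph-level summary with node-level features.
# The subgraph choice (1-hop) and mean pooling are assumptions for illustration.
import numpy as np

def two_level_features(X, A):
    """Concatenate node features with a mean-pooled 1-hop subgraph summary."""
    n = X.shape[0]
    A_hat = A + np.eye(n)                                 # include the node itself
    sub = (A_hat @ X) / A_hat.sum(axis=1, keepdims=True)  # subgraph-level summary
    return np.concatenate([X, sub], axis=1)               # node-level || subgraph-level

rng = np.random.default_rng(4)
X = rng.normal(size=(5, 3))
A = (rng.random((5, 5)) < 0.4).astype(float); np.fill_diagonal(A, 0)
print(two_level_features(X, A).shape)                     # (5, 6)
```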
- Network In Graph Neural Network [9.951298152023691]
We present a model-agnostic methodology that allows arbitrary GNN models to increase their model capacity by making the model deeper.
Instead of adding or widening GNN layers, NGNN deepens a GNN model by inserting non-linear feedforward neural network layer(s) within each GNN layer.
arXiv Detail & Related papers (2021-11-23T03:58:56Z)
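The mechanism is easy to picture: in the hedged sketch below, a standard aggregation-plus-transform step is deepened by a small two-layer feedforward block inside the layer. Layer sizes and names are illustrative, not NGNN's exact configuration.

```python
# Hedged sketch of NGNN's "network in network" idea: after the usual aggregation
# and linear transform, insert a small non-linear feedforward block before the
# features reach the next GNN layer.
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def ngnn_layer(X, A, W_gnn, W1, W2):
    """One GNN layer (A @ X @ W_gnn) deepened by an inner 2-layer MLP."""
    h = relu(A @ X @ W_gnn)          # standard GNN aggregation + transform
    return relu(relu(h @ W1) @ W2)   # inner feedforward network within the layer

rng = np.random.default_rng(5)
X = rng.normal(size=(5, 4))
A = (rng.random((5, 5)) < 0.4).astype(float)
out = ngnn_layer(X, A, rng.normal(size=(4, 8)), rng.normal(size=(8, 8)), rng.normal(size=(8, 8)))
print(out.shape)                     # (5, 8)
```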
- Feature Correlation Aggregation: on the Path to Better Graph Neural Networks [37.79964911718766]
Prior to the introduction of Graph Neural Networks (GNNs), modeling and analyzing irregular data, particularly graphs, was thought to be the Achilles' heel of deep learning.
This paper introduces a central node permutation variant function through a frustratingly simple and innocent-looking modification to the core operation of a GNN.
A tangible performance boost is observed: the model surpasses previous state-of-the-art results by a significant margin while employing fewer parameters.
arXiv Detail & Related papers (2021-09-20T05:04:26Z)
- Overcoming Catastrophic Forgetting in Graph Neural Networks [50.900153089330175]
Catastrophic forgetting refers to the tendency of a neural network to "forget" previously learned knowledge upon learning new tasks.
We propose a novel scheme dedicated to overcoming this problem and hence strengthening continual learning in graph neural networks (GNNs).
At the heart of our approach is a generic module, termed topology-aware weight preserving (TWP).
arXiv Detail & Related papers (2020-12-10T22:30:25Z)
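A hedged sketch of the weight-preserving side of the idea, in the style of importance-weighted penalties from continual learning: parameter movement on a new task is penalized in proportion to an importance score from the old task. TWP's actual importance measure is topology-aware, taking the graph structure into account; the `importance` vector below is a placeholder.

```python
# Hedged sketch of weight preserving: penalize movement of parameters important
# to old tasks. TWP's importance is topology-aware; here it is a placeholder.
import numpy as np

def preserving_penalty(params, old_params, importance, lam=1.0):
    """EWC-style penalty: lam * sum_i importance_i * (w_i - w_old_i)^2."""
    return lam * float(np.sum(importance * (params - old_params) ** 2))

rng = np.random.default_rng(6)
old_w = rng.normal(size=10)
new_w = old_w + 0.1 * rng.normal(size=10)
imp = rng.random(10)                     # per-parameter importance from the old task
print(preserving_penalty(new_w, old_w, imp))
```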
- A Unified View on Graph Neural Networks as Graph Signal Denoising [49.980783124401555]
Graph Neural Networks (GNNs) have risen to prominence in learning representations for graph structured data.
In this work, we establish mathematically that the aggregation processes in a group of representative GNN models can be regarded as solving a graph denoising problem.
We instantiate a novel GNN model, ADA-UGNN, derived from UGNN, to handle graphs with adaptive smoothness across nodes.
arXiv Detail & Related papers (2020-10-05T04:57:18Z)
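As a concrete instance of this view (with notation assumed here: input features X, normalized adjacency Ã, Laplacian L = I − Ã, stepsize b, and smoothness weight c), one gradient-descent step on the denoising objective, taken at F = X, reproduces GCN-style aggregation:

```latex
\begin{aligned}
  &\min_{F}\; \mathcal{L}(F) \;=\; \lVert F - X \rVert_F^2 \;+\; c\,\operatorname{tr}\!\big(F^{\top} L F\big),
  \qquad L = I - \tilde{A},\\
  &\nabla_F \mathcal{L} \;=\; 2\,(F - X) \;+\; 2c\,LF,\\
  &F \;\leftarrow\; X - b\,\nabla_F \mathcal{L}\big|_{F=X}
   \;=\; (1 - 2bc)\,X + 2bc\,\tilde{A}X
   \;=\; \tilde{A}X \quad \text{when } 2bc = 1 .
\end{aligned}
```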
- Policy-GNN: Aggregation Optimization for Graph Neural Networks [60.50932472042379]
Graph neural networks (GNNs) aim to model the local graph structures and capture the hierarchical patterns by aggregating the information from neighbors.
It is a challenging task to develop an effective aggregation strategy for each node, given complex graphs and sparse features.
We propose Policy-GNN, a meta-policy framework that models the sampling procedure and message passing of GNNs into a combined learning process.
arXiv Detail & Related papers (2020-06-26T17:03:06Z)
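A hedged sketch of the core decision being optimized: a per-node policy selects how many aggregation hops to apply, and each node takes its representation from its chosen depth. In the paper the policy is trained with reinforcement learning as part of a combined learning process; below it is an untrained linear scorer, purely for illustration.

```python
# Hedged sketch of Policy-GNN's core decision: a per-node policy picks an
# aggregation depth. The real policy is trained with RL; this one is untrained.
import numpy as np

def policy_gnn_forward(X, A, policy_W, max_hops=3):
    """policy_W: (d, max_hops) hypothetical policy weights scoring each depth."""
    hops = np.argmax(X @ policy_W, axis=1) + 1       # chosen depth per node, 1..max_hops
    deg = A.sum(axis=1, keepdims=True).clip(min=1)
    layers, H = [X], X
    for _ in range(max_hops):                        # precompute all depths
        H = (A @ H) / deg
        layers.append(H)
    return np.stack([layers[k][i] for i, k in enumerate(hops)])

rng = np.random.default_rng(7)
X = rng.normal(size=(5, 4))
A = (rng.random((5, 5)) < 0.5).astype(float)
print(policy_gnn_forward(X, A, rng.normal(size=(4, 3))).shape)  # (5, 4)
```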
- EdgeNets: Edge Varying Graph Neural Networks [179.99395949679547]
This paper puts forth a general framework that unifies state-of-the-art graph neural networks (GNNs) through the concept of EdgeNet.
An EdgeNet is a GNN architecture that allows different nodes to use different parameters to weigh the information of different neighbors.
This is a general linear and local operation that a node can perform, and it encompasses under one formulation all existing graph convolutional neural networks (GCNNs) as well as graph attention networks (GATs).
arXiv Detail & Related papers (2020-01-21T15:51:17Z)
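The unifying operation is simple to state: every edge (i, j) carries its own weight used when node i aggregates node j's features. A minimal sketch follows; fixing the weight matrix to a normalized adjacency recovers GCN-style convolution, while computing it from node pairs recovers attention-style (GAT) weighting. Names are illustrative.

```python
# Hedged sketch of the EdgeNet idea: per-edge weights Phi[i, j] so that
# x_i' = sum_j Phi[i, j] * x_j, a linear and local edge-varying operation.
import numpy as np

def edge_varying_layer(X, Phi, W):
    """X: (n, d); Phi: (n, n) per-edge weights (zero where there is no edge)."""
    return Phi @ X @ W                        # linear, local, edge-varying operation

rng = np.random.default_rng(8)
n, X = 5, rng.normal(size=(5, 3))
A = (rng.random((n, n)) < 0.5).astype(float)
Phi = A * rng.random((n, n))                  # one free parameter per existing edge
print(edge_varying_layer(X, Phi, rng.normal(size=(3, 4))).shape)  # (5, 4)
```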