NCGNN: Node-level Capsule Graph Neural Network
- URL: http://arxiv.org/abs/2012.03476v1
- Date: Mon, 7 Dec 2020 06:46:17 GMT
- Title: NCGNN: Node-level Capsule Graph Neural Network
- Authors: Rui Yang, Wenrui Dai, Chenglin Li, Junni Zou, Hongkai Xiong
- Abstract summary: Node-level Capsule Graph Neural Network (NCGNN) represents nodes as groups of capsules.
A novel dynamic routing procedure is developed to adaptively select appropriate capsules for aggregation.
NCGNN effectively addresses the over-smoothing issue and outperforms the state of the art by producing better node embeddings for classification.
- Score: 45.23653314235767
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Message passing has evolved as an effective tool for designing Graph Neural
Networks (GNNs). However, most existing works naively sum or average all the
neighboring features to update node representations, which suffers from the
following limitations: (1) lack of interpretability to identify the node
features crucial for the GNN's prediction; (2) the over-smoothing issue, where
repeated averaging aggregates excessive noise, making the features of nodes in
different classes over-mixed and thus indistinguishable. In this paper, we
propose the
Node-level Capsule Graph Neural Network (NCGNN) to address these issues with an
improved message passing scheme. Specifically, NCGNN represents nodes as groups
of capsules, in which each capsule extracts distinctive features of its
corresponding node. For each node-level capsule, a novel dynamic routing
procedure is developed to adaptively select appropriate capsules for
aggregation from a subgraph identified by the designed graph filter.
Consequently, as only the advantageous capsules are aggregated and harmful
noise is restrained, the features of interacting nodes from different classes
are less likely to be over-mixed, which relieves the over-smoothing issue.
Furthermore,
since the graph filter and the dynamic routing identify a subgraph and a subset
of node features that are most influential for the prediction of the model,
NCGNN is inherently interpretable and exempt from complex post-hoc
explanations. Extensive experiments on six node classification benchmarks
demonstrate that NCGNN effectively addresses the over-smoothing issue and
outperforms the state of the art by producing better node embeddings for
classification.
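To make the aggregation scheme concrete, below is a minimal PyTorch sketch of node-level capsules with dynamic routing, written only from the description in this abstract. The class name NodeCapsuleLayer, the shapes, and the agreement-based update rule are illustrative assumptions, not the authors' released implementation; the filtered subgraph is passed in as a plain adjacency mask rather than produced by the paper's graph filter.

```python
import torch
import torch.nn.functional as F


def squash(s, dim=-1, eps=1e-8):
    # Standard capsule "squash": keeps the direction, maps the norm into (0, 1).
    norm2 = (s * s).sum(dim=dim, keepdim=True)
    return (norm2 / (1.0 + norm2)) * s / torch.sqrt(norm2 + eps)


class NodeCapsuleLayer(torch.nn.Module):
    def __init__(self, num_in, num_out, d_in, d_out, routing_iters=3):
        super().__init__()
        # One learned transform per (input capsule, output capsule) pair.
        self.W = torch.nn.Parameter(0.1 * torch.randn(num_in, num_out, d_in, d_out))
        self.routing_iters = routing_iters

    def forward(self, x, adj):
        # x:   [N, num_in, d_in], the capsules of each node
        # adj: [N, N] 0/1 mask of the filtered subgraph (which neighbors may
        #      send messages to which target node); the paper's graph filter
        #      would produce this, here it is simply given.
        N = x.shape[0]
        # "Votes" from every node's input capsules to each output capsule.
        u_hat = torch.einsum('nid,iode->nioe', x, self.W)        # [N, num_in, num_out, d_out]
        b = torch.zeros(N, N, self.W.shape[0], self.W.shape[1])  # routing logits
        mask = adj.unsqueeze(-1).unsqueeze(-1)                   # [N, N, 1, 1]
        for _ in range(self.routing_iters):
            # Each (neighbor, input capsule) softly chooses output capsules;
            # non-neighbors are masked out of the aggregation.
            c = F.softmax(b, dim=-1) * mask
            s = torch.einsum('tjio,jioe->toe', c, u_hat)  # aggregate the votes
            v = squash(s)                                 # [N, num_out, d_out]
            # Raise logits where a neighbor's vote agrees with the output.
            b = b + torch.einsum('jioe,toe->tjio', u_hat, v)
        return v


# Toy usage: 5 nodes, 4 capsules of width 8 per node.
layer = NodeCapsuleLayer(num_in=4, num_out=4, d_in=8, d_out=8)
out = layer(torch.randn(5, 4, 8), (torch.rand(5, 5) > 0.5).float())
print(out.shape)  # torch.Size([5, 4, 8])
```

The masked softmax is what lets each target node aggregate only the capsules routed to it from its filtered neighborhood, which is the mechanism the abstract credits for restraining harmful noise.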
Related papers
- SF-GNN: Self Filter for Message Lossless Propagation in Deep Graph Neural Network [38.669815079957566]
Graph Neural Networks (GNNs), whose main idea is to encode graph structure information through propagation and aggregation, have developed rapidly.
They achieve excellent performance in representation learning on multiple types of graphs, such as homogeneous graphs, heterogeneous graphs, and more complex graphs like knowledge graphs.
We propose a new perspective on the phenomenon of performance degradation in deep GNNs.
arXiv Detail & Related papers (2024-07-03T02:40:39Z)
- Conditional Local Feature Encoding for Graph Neural Networks [14.983942698240293]
Graph neural networks (GNNs) have shown great success in learning from graph-based data.
The key mechanism of current GNNs is message passing, where a node's feature is updated based on the information passing from its local neighbourhood.
We propose conditional local feature encoding (CLFE) to help prevent node features from being dominated by information from the local neighbourhood.
arXiv Detail & Related papers (2024-05-08T01:51:19Z)
- Degree-based stratification of nodes in Graph Neural Networks [66.17149106033126]
We modify the Graph Neural Network (GNN) architecture so that the weight matrices are learned separately for the nodes in each degree group.
This simple-to-implement modification seems to improve performance across datasets and GNN methods.
arXiv Detail & Related papers (2023-12-16T14:09:23Z)
- Collaborative Graph Neural Networks for Attributed Network Embedding [63.39495932900291]
Graph neural networks (GNNs) have shown prominent performance on attributed network embedding.
We propose COllaborative graph Neural Networks (CONN), a tailored GNN architecture for network embedding.
arXiv Detail & Related papers (2023-07-22T04:52:27Z)
- NodeFormer: A Scalable Graph Structure Learning Transformer for Node Classification [70.51126383984555]
We introduce a novel all-pair message passing scheme for efficiently propagating node signals between arbitrary nodes.
The efficient computation is enabled by a kernelized Gumbel-Softmax operator (see the Gumbel-Softmax sketch after this list).
Experiments demonstrate the promising efficacy of the method in various tasks, including node classification on graphs.
arXiv Detail & Related papers (2023-06-14T09:21:15Z)
- NDGGNET-A Node Independent Gate based Graph Neural Networks [6.155450481110693]
For nodes with sparse connectivity, it is difficult to obtain enough information through a single GNN layer.
In this paper, we define a novel framework that allows a normal GNN model to accommodate more layers.
Experimental results show that our proposed model can effectively increase the model depth and perform well on several datasets.
arXiv Detail & Related papers (2022-05-11T08:51:04Z)
- Exploiting Neighbor Effect: Conv-Agnostic GNNs Framework for Graphs with Heterophily [58.76759997223951]
We propose a new metric based on von Neumann entropy to re-examine the heterophily problem of GNNs.
We also propose a Conv-Agnostic GNN framework (CAGNNs) to enhance the performance of most GNNs on heterophily datasets.
arXiv Detail & Related papers (2022-03-19T14:26:43Z)
- Graph Neural Networks with Feature and Structure Aware Random Walk [7.143879014059894]
We show that in typical heterophilous graphs, the edges may be directed, and whether to treat the edges as directed or simply make them undirected greatly affects the performance of GNN models.
We develop a model that adaptively learns the directionality of the graph, and exploits the underlying long-distance correlations between nodes.
arXiv Detail & Related papers (2021-11-19T08:54:21Z)
- Graph Pointer Neural Networks [11.656981519694218]
We present Graph Pointer Neural Networks (GPNN) to tackle the challenges mentioned above.
We leverage a pointer network to select the most relevant nodes from large multi-hop neighborhoods.
GPNN significantly improves classification performance over state-of-the-art methods.
arXiv Detail & Related papers (2021-10-03T10:18:25Z)
- Node2Seq: Towards Trainable Convolutions in Graph Neural Networks [59.378148590027735]
We propose a graph network layer, named Node2Seq, to learn node embeddings with explicitly trainable weights for different neighboring nodes.
For a target node, our method sorts its neighboring nodes via an attention mechanism and then employs 1D convolutional neural networks (CNNs) to enable explicit weights for information aggregation (see the sketch after this list).
In addition, we propose to incorporate non-local information for feature learning in an adaptive manner based on the attention scores.
arXiv Detail & Related papers (2021-01-06T03:05:37Z)
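As background for the NodeFormer entry above: its kernelized Gumbel-Softmax operator builds on standard straight-through Gumbel-Softmax sampling, sketched below. This is the textbook operator only, not NodeFormer's kernelized variant or its code.

```python
import torch
import torch.nn.functional as F

def gumbel_softmax(logits, tau=1.0, hard=False):
    # Sample Gumbel(0, 1) noise via the inverse-CDF trick -log(-log(U)).
    u = torch.rand_like(logits).clamp(1e-10, 1.0 - 1e-10)
    g = -torch.log(-torch.log(u))
    y = F.softmax((logits + g) / tau, dim=-1)  # differentiable soft sample
    if hard:
        # Straight-through: one-hot in the forward pass, soft gradients backward.
        one_hot = torch.zeros_like(y).scatter_(-1, y.argmax(-1, keepdim=True), 1.0)
        y = (one_hot - y).detach() + y
    return y

# E.g., softly choosing which of 6 candidate neighbors each of 4 nodes attends to.
weights = gumbel_softmax(torch.randn(4, 6), tau=0.5)
```

PyTorch also ships this operator as torch.nn.functional.gumbel_softmax; it is written out here only to show the mechanism.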
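Likewise, for the Node2Seq entry, here is a minimal sketch of the "sort neighbors by attention, then apply a 1D convolution" idea it describes. The class name, shapes, and mean pooling are illustrative assumptions, not the authors' implementation.

```python
import torch

class AttnSortConv(torch.nn.Module):
    # Score neighbors with a learned attention vector, sort them into a
    # sequence by score, then run a 1D convolution so that each rank
    # position receives its own explicit aggregation weight.
    def __init__(self, d, k=3):
        super().__init__()
        self.score = torch.nn.Linear(d, 1)                # attention scorer
        self.conv = torch.nn.Conv1d(d, d, kernel_size=k)  # per-position weights

    def forward(self, neighbors):
        # neighbors: [m, d] features of one target node's m neighbors (m >= k).
        order = self.score(neighbors).squeeze(-1).argsort(descending=True)
        seq = neighbors[order].t().unsqueeze(0)           # [1, d, m], channels first
        return self.conv(seq).mean(dim=-1).squeeze(0)     # pooled aggregate, [d]

# Toy usage: aggregate 5 neighbors with 16-dimensional features.
agg = AttnSortConv(d=16, k=3)(torch.randn(5, 16))
print(agg.shape)  # torch.Size([16])
```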
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.