AM-GCN: Adaptive Multi-channel Graph Convolutional Networks
- URL: http://arxiv.org/abs/2007.02265v2
- Date: Sat, 11 Jul 2020 03:23:47 GMT
- Title: AM-GCN: Adaptive Multi-channel Graph Convolutional Networks
- Authors: Xiao Wang, Meiqi Zhu, Deyu Bo, Peng Cui, Chuan Shi, Jian Pei
- Abstract summary: We study whether Graph Convolutional Networks (GCNs) can optimally integrate node features and topological structures in a complex graph with rich information.
We propose adaptive multi-channel graph convolutional networks for semi-supervised classification (AM-GCN).
Our experiments show that AM-GCN extracts the most correlated information from both node features and topological structures, and improves classification accuracy by a clear margin.
- Score: 85.0332394224503
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph Convolutional Networks (GCNs) have gained great popularity in tackling
various analytics tasks on graph and network data. However, some recent studies
raise concerns about whether GCNs can optimally integrate node features and
topological structures in a complex graph with rich information. In this paper,
we first present an experimental investigation. Surprisingly, our experimental
results clearly show that the capability of the state-of-the-art GCNs in fusing
node features and topological structures is far from optimal, or even
satisfactory. This weakness may severely hinder the performance of GCNs in
some classification tasks, since GCNs may not be able to adaptively learn some deep
correlation information between topological structures and node features. Can
we remedy the weakness and design a new type of GCNs that can retain the
advantages of the state-of-the-art GCNs and, at the same time, enhance the
capability of fusing topological structures and node features substantially? We
tackle the challenge and propose an adaptive multi-channel graph convolutional
networks for semi-supervised classification (AM-GCN). The central idea is that
we extract the specific and common embeddings from node features, topological
structures, and their combinations simultaneously, and use the attention
mechanism to learn adaptive importance weights of the embeddings. Our extensive
experiments on benchmark data sets clearly show that AM-GCN extracts the most
correlated information from both node features and topological structures,
and improves the classification accuracy by a clear margin.
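To make the central idea concrete, the following is a minimal PyTorch-style sketch of the multi-channel design described above, not the authors' released implementation: two channel-specific GCNs (one on the topology graph, one on a kNN graph built from node features), a common GCN with shared weights applied to both graphs, and an attention mechanism that learns per-node importance weights for the three embeddings. The layer sizes, the precomputed normalized adjacencies `adj_t` and `adj_f`, and the single-linear attention scorer are illustrative assumptions; the paper's consistency and disparity constraints are omitted for brevity.

```python
# Minimal sketch of the AM-GCN idea (not the authors' code): three GCN channels
# (topology-specific, feature-specific, and a common channel with shared weights)
# whose node embeddings are fused by a learned attention over channels.
# Assumption: `adj_t` is the normalized topology adjacency and `adj_f` a normalized
# kNN graph built from node features, both given as dense (N, N) tensors.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GCNLayer(nn.Module):
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj):
        # One graph convolution: aggregate neighbors, then transform.
        return self.lin(adj @ x)


class TwoLayerGCN(nn.Module):
    def __init__(self, in_dim, hid_dim, out_dim):
        super().__init__()
        self.gc1 = GCNLayer(in_dim, hid_dim)
        self.gc2 = GCNLayer(hid_dim, out_dim)

    def forward(self, x, adj):
        return self.gc2(F.relu(self.gc1(x, adj)), adj)


class AMGCNSketch(nn.Module):
    def __init__(self, in_dim, hid_dim, emb_dim, n_classes):
        super().__init__()
        self.gcn_topology = TwoLayerGCN(in_dim, hid_dim, emb_dim)  # specific channel on topology graph
        self.gcn_feature = TwoLayerGCN(in_dim, hid_dim, emb_dim)   # specific channel on feature (kNN) graph
        self.gcn_common = TwoLayerGCN(in_dim, hid_dim, emb_dim)    # shared-weight channel applied to both graphs
        self.attn = nn.Linear(emb_dim, 1)                          # scores each channel embedding per node
        self.classifier = nn.Linear(emb_dim, n_classes)

    def forward(self, x, adj_t, adj_f):
        z_t = self.gcn_topology(x, adj_t)
        z_f = self.gcn_feature(x, adj_f)
        z_c = 0.5 * (self.gcn_common(x, adj_t) + self.gcn_common(x, adj_f))

        # Attention over the three channels: per-node softmax of channel scores.
        stacked = torch.stack([z_t, z_c, z_f], dim=1)                  # (N, 3, emb_dim)
        alpha = torch.softmax(self.attn(torch.tanh(stacked)), dim=1)  # (N, 3, 1)
        z = (alpha * stacked).sum(dim=1)                               # adaptive fusion
        return F.log_softmax(self.classifier(z), dim=-1)
```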
Related papers
- Learning Invariant Representations of Graph Neural Networks via Cluster Generalization [58.68231635082891]
Graph neural networks (GNNs) have become increasingly popular in modeling graph-structured data.
In this paper, we experimentally find that the performance of GNNs drops significantly when a structure shift happens.
We propose the Cluster Information Transfer (CIT) mechanism, which can learn invariant representations for GNNs.
arXiv Detail & Related papers (2024-03-06T10:36:56Z)
- DGNN: Decoupled Graph Neural Networks with Structural Consistency between Attribute and Graph Embedding Representations [62.04558318166396]
Graph neural networks (GNNs) demonstrate a robust capability for representation learning on graphs with complex structures.
A novel GNN framework, dubbed Decoupled Graph Neural Networks (DGNN), is introduced to obtain a more comprehensive embedding representation of nodes.
Experimental results on several graph benchmark datasets verify DGNN's superiority in the node classification task.
arXiv Detail & Related papers (2024-01-28T06:43:13Z)
- What Do Graph Convolutional Neural Networks Learn? [0.0]
Graph Convolutional Neural Networks (GCNs) are a common variant of graph neural networks (GNNs).
Recent literature has highlighted that GCNs can achieve strong performance on heterophilous graphs under certain "special conditions".
Our investigation of the underlying graph structures of a dataset finds that a GCN's semi-supervised node classification (SSNC) performance is significantly influenced by the consistency and uniqueness of the neighborhood structure of nodes within a class.
arXiv Detail & Related papers (2022-07-05T06:44:37Z)
- Spiking Graph Convolutional Networks [19.36064180392385]
SpikingGCN is an end-to-end framework that aims to integrate the embedding of GCNs with the biofidelity characteristics of spiking neural networks (SNNs).
We show that SpikingGCN on a neuromorphic chip can bring a clear advantage of energy efficiency into graph data analysis.
arXiv Detail & Related papers (2022-05-05T16:44:36Z)
- Multi-scale Graph Convolutional Networks with Self-Attention [2.66512000865131]
Graph convolutional networks (GCNs) have achieved remarkable learning ability on various kinds of graph-structured data.
The over-smoothing phenomenon, a crucial issue of GCNs, remains to be investigated and solved.
We propose two novel multi-scale GCN frameworks by incorporating self-attention mechanism and multi-scale information into the design of GCNs.
arXiv Detail & Related papers (2021-12-04T04:41:24Z)
- SStaGCN: Simplified stacking based graph convolutional networks [2.556756699768804]
The graph convolutional network (GCN) is a powerful model studied broadly in various graph-structured data learning tasks.
We propose a novel GCN called SStaGCN (Simplified stacking based GCN) by utilizing the ideas of stacking and aggregation.
We show that SStaGCN can efficiently mitigate the over-smoothing problem of GCN.
arXiv Detail & Related papers (2021-11-16T05:00:08Z)
- Curvature Graph Neural Network [8.477559786537919]
We introduce discrete graph curvature (the Ricci curvature) to quantify the strength of the structural connection between pairs of nodes.
We propose Curvature Graph Neural Network (CGNN), which effectively improves the adaptive locality ability of GNNs.
The experimental results on synthetic datasets show that CGNN effectively exploits the topology structure information.
arXiv Detail & Related papers (2021-06-30T00:56:03Z)
- Hierarchical Graph Capsule Network [78.4325268572233]
We propose hierarchical graph capsule network (HGCN) that can jointly learn node embeddings and extract graph hierarchies.
To learn the hierarchical representation, HGCN characterizes the part-whole relationship between lower-level capsules (parts) and higher-level capsules (wholes).
arXiv Detail & Related papers (2020-12-16T04:13:26Z)
- DeeperGCN: All You Need to Train Deeper GCNs [66.64739331859226]
Graph Convolutional Networks (GCNs) have been drawing significant attention with the power of representation learning on graphs.
Unlike Convolutional Neural Networks (CNNs), which are able to take advantage of stacking very deep layers, GCNs suffer from vanishing gradient, over-smoothing and over-fitting issues when going deeper.
This paper proposes DeeperGCN, which is capable of successfully and reliably training very deep GCNs.
arXiv Detail & Related papers (2020-06-13T23:00:22Z)
- Cross-GCN: Enhancing Graph Convolutional Network with $k$-Order Feature Interactions [153.6357310444093]
The Graph Convolutional Network (GCN) is an emerging technique that performs learning and reasoning on graph data.
We argue that existing designs of GCNs forgo modeling cross features, making GCNs less effective for tasks or data where cross features are important.
We design a new operator named Cross-feature Graph Convolution, which explicitly models arbitrary-order cross features with complexity linear in the feature dimension and order size.
arXiv Detail & Related papers (2020-03-05T13:05:27Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.