Adaptive Propagation Graph Convolutional Network
- URL: http://arxiv.org/abs/2002.10306v3
- Date: Mon, 28 Sep 2020 09:28:34 GMT
- Title: Adaptive Propagation Graph Convolutional Network
- Authors: Indro Spinelli, Simone Scardapane, Aurelio Uncini
- Abstract summary: Graph convolutional networks (GCNs) are a family of neural network models that perform inference on graph data.
We show that state-of-the-art results can be achieved by adapting the number of communication steps independently at every node.
We show that the proposed adaptive propagation GCN (AP-GCN) achieves superior or similar results to the best proposed models.
- Score: 17.41698818541144
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph convolutional networks (GCNs) are a family of neural network models
that perform inference on graph data by interleaving vertex-wise operations and
message-passing exchanges across nodes. Concerning the latter, two key
questions arise: (i) how to design a differentiable exchange protocol (e.g., a
1-hop Laplacian smoothing in the original GCN), and (ii) how to characterize
the trade-off in complexity with respect to the local updates. In this paper,
we show that state-of-the-art results can be achieved by adapting the number of
communication steps independently at every node. In particular, we endow each
node with a halting unit (inspired by Graves' adaptive computation time) that
after every exchange decides whether to continue communicating or not. We show
that the proposed adaptive propagation GCN (AP-GCN) achieves superior or
similar results to the best proposed models so far on a number of benchmarks,
while requiring a small overhead in terms of additional parameters. We also
investigate a regularization term to enforce an explicit trade-off between
communication and accuracy. The code for the AP-GCN experiments is released as
an open-source library.
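The per-node halting mechanism described in the abstract can be sketched as follows. This is a minimal numpy illustration in the spirit of Graves' adaptive computation time, not the released AP-GCN code: the linear halting unit, its parameters `w` and `b`, and the accumulation scheme are illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def adaptive_propagation(A_hat, X, w, b, max_steps=10, eps=0.01):
    """ACT-style propagation with a per-node halting unit (sketch).

    A_hat : (n, n) normalized adjacency, e.g. D^-1/2 (A + I) D^-1/2
    X     : (n, d) node features after the vertex-wise transformation
    w, b  : parameters of a (hypothetical) linear halting unit
    """
    n = X.shape[0]
    H = X.copy()                      # current propagated features
    out = np.zeros_like(X)            # halting-weighted output per node
    cum = np.zeros(n)                 # cumulative halting probability
    active = np.ones(n, dtype=bool)   # nodes still communicating
    steps = np.zeros(n, dtype=int)    # exchanges performed per node

    for _ in range(max_steps):
        H = A_hat @ H                          # one message-passing exchange
        p = sigmoid(H @ w + b)                 # halting probability per node
        # a node whose budget would be exhausted uses the remainder instead
        last = active & (cum + p >= 1.0 - eps)
        weight = np.where(last, 1.0 - cum, p)
        out[active] += weight[active, None] * H[active]
        cum[active] += weight[active]
        steps[active] += 1
        active = active & ~last                # halted nodes stop communicating
        if not active.any():
            break
    return out, steps
```

Because each node accumulates its own halting probability, neighboring nodes can perform different numbers of exchanges, which is the adaptive-depth behavior the paper exploits.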
Related papers
- Rethinking Graph Transformer Architecture Design for Node Classification [4.497245600377944]
Graph Transformer (GT) is a special type of Graph Neural Network (GNN) that utilizes multi-head attention to facilitate high-order message passing.
In this work, we conduct observational experiments to explore the adaptability of the GT architecture in node classification tasks.
Our proposed GT architecture can effectively adapt to node classification tasks without being affected by global noise or constrained by computational-efficiency limitations.
arXiv Detail & Related papers (2024-10-15T02:08:16Z)
- Scalable Graph Compressed Convolutions [68.85227170390864]
We propose a differentiable method that applies permutations to calibrate input graphs for Euclidean convolution.
Based on the graph calibration, we propose the Compressed Convolution Network (CoCN) for hierarchical graph representation learning.
arXiv Detail & Related papers (2024-07-26T03:14:13Z)
- Transfer Entropy in Graph Convolutional Neural Networks [0.0]
Graph Convolutional Networks (GCNs) are Graph Neural Networks in which the convolutions are applied over a graph.
In this study, we address two important challenges related to GCNs, one of which is oversmoothing.
Oversmoothing is the degradation of the discriminative capacity of nodes as a result of repeated aggregations.
We propose a new strategy for addressing these challenges in GCNs based on Transfer Entropy (TE), which measures the amount of directed information transfer between two time-varying nodes.
arXiv Detail & Related papers (2024-06-08T20:09:17Z)
- Every Node Counts: Improving the Training of Graph Neural Networks on Node Classification [9.539495585692007]
We propose novel objective terms for the training of GNNs for node classification.
Our first term seeks to maximize the mutual information between node and label features.
Our second term promotes anisotropic smoothness in the prediction maps.
arXiv Detail & Related papers (2022-11-29T23:25:14Z)
- Binary Graph Convolutional Network with Capacity Exploration [58.99478502486377]
We propose a Binary Graph Convolutional Network (Bi-GCN), which binarizes both the network parameters and input node attributes.
Our Bi-GCN can reduce the memory consumption by an average of 31x for both the network parameters and input data, and accelerate the inference speed by an average of 51x.
arXiv Detail & Related papers (2022-10-24T12:05:17Z)
- Graph Ordering Attention Networks [22.468776559433614]
Graph Neural Networks (GNNs) have been successfully used in many problems involving graph-structured data.
We introduce the Graph Ordering Attention (GOAT) layer, a novel GNN component that captures interactions between nodes in a neighborhood.
The GOAT layer demonstrates increased performance in modeling graph metrics that capture complex information.
arXiv Detail & Related papers (2022-04-11T18:13:19Z)
- A Variational Edge Partition Model for Supervised Graph Representation Learning [51.30365677476971]
This paper introduces a graph generative process to model how the observed edges are generated by aggregating the node interactions over a set of overlapping node communities.
We partition each edge into the summation of multiple community-specific weighted edges and use them to define community-specific GNNs.
A variational inference framework is proposed to jointly learn a GNN-based inference network that partitions the edges into different communities, the community-specific GNNs themselves, and a GNN-based predictor that combines the community-specific GNNs for the end classification task.
arXiv Detail & Related papers (2022-02-07T14:37:50Z)
- Bi-GCN: Binary Graph Convolutional Network [57.733849700089955]
We propose a Binary Graph Convolutional Network (Bi-GCN), which binarizes both the network parameters and input node features.
Our Bi-GCN can reduce the memory consumption by an average of 30x for both the network parameters and input data, and accelerate the inference speed by an average of 47x.
arXiv Detail & Related papers (2020-10-15T07:26:23Z)
- A Unified View on Graph Neural Networks as Graph Signal Denoising [49.980783124401555]
Graph Neural Networks (GNNs) have risen to prominence in learning representations for graph structured data.
In this work, we establish mathematically that the aggregation processes in a group of representative GNN models can be regarded as solving a graph denoising problem.
We instantiate a novel GNN model, ADA-UGNN, derived from UGNN, to handle graphs with adaptive smoothness across nodes.
arXiv Detail & Related papers (2020-10-05T04:57:18Z)
- Unifying Graph Convolutional Neural Networks and Label Propagation [73.82013612939507]
We study the relationship between LPA and GCN in terms of two aspects: feature/label smoothing and feature/label influence.
Based on our theoretical analysis, we propose an end-to-end model that unifies GCN and LPA for node classification.
Our model can also be seen as learning attention weights based on node labels, which is more task-oriented than existing feature-based attention models.
arXiv Detail & Related papers (2020-02-17T03:23:13Z)
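Two of the related papers above (the Bi-GCN entries) binarize both the network parameters and the node inputs to obtain their roughly 30x memory reduction. The standard sign-plus-scaling approximation behind such schemes can be sketched in a few lines of numpy; the per-row scaling choice and function names here are illustrative assumptions, not the exact Bi-GCN formulation.

```python
import numpy as np

def binarize(W):
    """Approximate W ~ alpha * sign(W) (XNOR-Net-style sketch).

    alpha is the mean absolute value per row, so the float matrix can be
    stored as 1-bit signs plus one float per row (~32x smaller than float32).
    """
    alpha = np.abs(W).mean(axis=1, keepdims=True)  # per-row scaling factor
    B = np.where(W >= 0.0, 1.0, -1.0)              # 1-bit sign matrix
    return alpha, B

def binary_layer(A_hat, X, W):
    """One graph-convolution layer using the binarized weights."""
    alpha, B = binarize(W)
    return A_hat @ X @ (alpha * B)
```

At inference time the dense float multiplication can then be replaced by bitwise operations on the sign matrices, which is the source of the reported speedups.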
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.